Nov 29 00:35:33 np0005539510 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 00:35:33 np0005539510 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 00:35:33 np0005539510 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:33 np0005539510 kernel: BIOS-provided physical RAM map:
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 00:35:33 np0005539510 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 00:35:33 np0005539510 kernel: NX (Execute Disable) protection: active
Nov 29 00:35:33 np0005539510 kernel: APIC: Static calls initialized
Nov 29 00:35:33 np0005539510 kernel: SMBIOS 2.8 present.
Nov 29 00:35:33 np0005539510 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 00:35:33 np0005539510 kernel: Hypervisor detected: KVM
Nov 29 00:35:33 np0005539510 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 00:35:33 np0005539510 kernel: kvm-clock: using sched offset of 3260837303 cycles
Nov 29 00:35:33 np0005539510 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 00:35:33 np0005539510 kernel: tsc: Detected 2800.000 MHz processor
Nov 29 00:35:33 np0005539510 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 00:35:33 np0005539510 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 00:35:33 np0005539510 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 00:35:33 np0005539510 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 00:35:33 np0005539510 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 00:35:33 np0005539510 kernel: Using GB pages for direct mapping
Nov 29 00:35:33 np0005539510 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 00:35:33 np0005539510 kernel: ACPI: Early table checksum verification disabled
Nov 29 00:35:33 np0005539510 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 00:35:33 np0005539510 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:33 np0005539510 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:33 np0005539510 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:33 np0005539510 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 00:35:33 np0005539510 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:33 np0005539510 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:33 np0005539510 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 00:35:33 np0005539510 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 00:35:33 np0005539510 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 00:35:33 np0005539510 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 00:35:33 np0005539510 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 00:35:33 np0005539510 kernel: No NUMA configuration found
Nov 29 00:35:33 np0005539510 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 00:35:33 np0005539510 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 29 00:35:33 np0005539510 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 00:35:33 np0005539510 kernel: Zone ranges:
Nov 29 00:35:33 np0005539510 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 00:35:33 np0005539510 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 00:35:33 np0005539510 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:33 np0005539510 kernel:  Device   empty
Nov 29 00:35:33 np0005539510 kernel: Movable zone start for each node
Nov 29 00:35:33 np0005539510 kernel: Early memory node ranges
Nov 29 00:35:33 np0005539510 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 00:35:33 np0005539510 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 00:35:33 np0005539510 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:33 np0005539510 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 00:35:33 np0005539510 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 00:35:33 np0005539510 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 00:35:33 np0005539510 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 00:35:33 np0005539510 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 00:35:33 np0005539510 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 00:35:33 np0005539510 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 00:35:33 np0005539510 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 00:35:33 np0005539510 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 00:35:33 np0005539510 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 00:35:33 np0005539510 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 00:35:33 np0005539510 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 00:35:33 np0005539510 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 00:35:33 np0005539510 kernel: TSC deadline timer available
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Max. logical packages:   8
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Max. logical dies:       8
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Max. dies per package:   1
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Max. threads per core:   1
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Num. cores per package:     1
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Num. threads per package:   1
Nov 29 00:35:33 np0005539510 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 00:35:33 np0005539510 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 00:35:33 np0005539510 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 00:35:33 np0005539510 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 00:35:33 np0005539510 kernel: Booting paravirtualized kernel on KVM
Nov 29 00:35:33 np0005539510 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 00:35:33 np0005539510 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 00:35:33 np0005539510 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 00:35:33 np0005539510 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 00:35:33 np0005539510 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:33 np0005539510 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 00:35:33 np0005539510 kernel: random: crng init done
Nov 29 00:35:33 np0005539510 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: Fallback order for Node 0: 0 
Nov 29 00:35:33 np0005539510 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 00:35:33 np0005539510 kernel: Policy zone: Normal
Nov 29 00:35:33 np0005539510 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 00:35:33 np0005539510 kernel: software IO TLB: area num 8.
Nov 29 00:35:33 np0005539510 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 00:35:33 np0005539510 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 00:35:33 np0005539510 kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 00:35:33 np0005539510 kernel: Dynamic Preempt: voluntary
Nov 29 00:35:33 np0005539510 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 00:35:33 np0005539510 kernel: rcu: 	RCU event tracing is enabled.
Nov 29 00:35:33 np0005539510 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 00:35:33 np0005539510 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 29 00:35:33 np0005539510 kernel: 	Rude variant of Tasks RCU enabled.
Nov 29 00:35:33 np0005539510 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 29 00:35:33 np0005539510 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 00:35:33 np0005539510 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 00:35:33 np0005539510 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:33 np0005539510 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:33 np0005539510 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:33 np0005539510 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 00:35:33 np0005539510 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 00:35:33 np0005539510 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 00:35:33 np0005539510 kernel: Console: colour VGA+ 80x25
Nov 29 00:35:33 np0005539510 kernel: printk: console [ttyS0] enabled
Nov 29 00:35:33 np0005539510 kernel: ACPI: Core revision 20230331
Nov 29 00:35:33 np0005539510 kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 00:35:33 np0005539510 kernel: x2apic enabled
Nov 29 00:35:33 np0005539510 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 00:35:33 np0005539510 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 00:35:33 np0005539510 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 29 00:35:33 np0005539510 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 00:35:33 np0005539510 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 00:35:33 np0005539510 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 00:35:33 np0005539510 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 00:35:33 np0005539510 kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 00:35:33 np0005539510 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 00:35:33 np0005539510 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 00:35:33 np0005539510 kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 00:35:33 np0005539510 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 00:35:33 np0005539510 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 00:35:33 np0005539510 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 00:35:33 np0005539510 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 00:35:33 np0005539510 kernel: x86/bugs: return thunk changed
Nov 29 00:35:33 np0005539510 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 00:35:33 np0005539510 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 00:35:33 np0005539510 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 00:35:33 np0005539510 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 00:35:33 np0005539510 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 00:35:33 np0005539510 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 00:35:33 np0005539510 kernel: Freeing SMP alternatives memory: 40K
Nov 29 00:35:33 np0005539510 kernel: pid_max: default: 32768 minimum: 301
Nov 29 00:35:33 np0005539510 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 00:35:33 np0005539510 kernel: landlock: Up and running.
Nov 29 00:35:33 np0005539510 kernel: Yama: becoming mindful.
Nov 29 00:35:33 np0005539510 kernel: SELinux:  Initializing.
Nov 29 00:35:33 np0005539510 kernel: LSM support for eBPF active
Nov 29 00:35:33 np0005539510 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 00:35:33 np0005539510 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 00:35:33 np0005539510 kernel: ... version:                0
Nov 29 00:35:33 np0005539510 kernel: ... bit width:              48
Nov 29 00:35:33 np0005539510 kernel: ... generic registers:      6
Nov 29 00:35:33 np0005539510 kernel: ... value mask:             0000ffffffffffff
Nov 29 00:35:33 np0005539510 kernel: ... max period:             00007fffffffffff
Nov 29 00:35:33 np0005539510 kernel: ... fixed-purpose events:   0
Nov 29 00:35:33 np0005539510 kernel: ... event mask:             000000000000003f
Nov 29 00:35:33 np0005539510 kernel: signal: max sigframe size: 1776
Nov 29 00:35:33 np0005539510 kernel: rcu: Hierarchical SRCU implementation.
Nov 29 00:35:33 np0005539510 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 29 00:35:33 np0005539510 kernel: smp: Bringing up secondary CPUs ...
Nov 29 00:35:33 np0005539510 kernel: smpboot: x86: Booting SMP configuration:
Nov 29 00:35:33 np0005539510 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 00:35:33 np0005539510 kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 00:35:33 np0005539510 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 29 00:35:33 np0005539510 kernel: node 0 deferred pages initialised in 11ms
Nov 29 00:35:33 np0005539510 kernel: Memory: 7766056K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616272K reserved, 0K cma-reserved)
Nov 29 00:35:33 np0005539510 kernel: devtmpfs: initialized
Nov 29 00:35:33 np0005539510 kernel: x86/mm: Memory block size: 128MB
Nov 29 00:35:33 np0005539510 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 00:35:33 np0005539510 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 00:35:33 np0005539510 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 00:35:33 np0005539510 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 00:35:33 np0005539510 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 00:35:33 np0005539510 kernel: audit: initializing netlink subsys (disabled)
Nov 29 00:35:33 np0005539510 kernel: audit: type=2000 audit(1764394531.482:1): state=initialized audit_enabled=0 res=1
Nov 29 00:35:33 np0005539510 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 00:35:33 np0005539510 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 00:35:33 np0005539510 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 00:35:33 np0005539510 kernel: cpuidle: using governor menu
Nov 29 00:35:33 np0005539510 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 00:35:33 np0005539510 kernel: PCI: Using configuration type 1 for base access
Nov 29 00:35:33 np0005539510 kernel: PCI: Using configuration type 1 for extended access
Nov 29 00:35:33 np0005539510 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 00:35:33 np0005539510 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 00:35:33 np0005539510 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 00:35:33 np0005539510 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 00:35:33 np0005539510 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 00:35:33 np0005539510 kernel: Demotion targets for Node 0: null
Nov 29 00:35:33 np0005539510 kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 00:35:33 np0005539510 kernel: ACPI: Added _OSI(Module Device)
Nov 29 00:35:33 np0005539510 kernel: ACPI: Added _OSI(Processor Device)
Nov 29 00:35:33 np0005539510 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 00:35:33 np0005539510 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 00:35:33 np0005539510 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 00:35:33 np0005539510 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 00:35:33 np0005539510 kernel: ACPI: Interpreter enabled
Nov 29 00:35:33 np0005539510 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 00:35:33 np0005539510 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 00:35:33 np0005539510 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 00:35:33 np0005539510 kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 00:35:33 np0005539510 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 00:35:33 np0005539510 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 00:35:33 np0005539510 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [3] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [4] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [5] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [6] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [7] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [8] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [9] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [10] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [11] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [12] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [13] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [14] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [15] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [16] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [17] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [18] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [19] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [20] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [21] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [22] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [23] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [24] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [25] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [26] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [27] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [28] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [29] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [30] registered
Nov 29 00:35:33 np0005539510 kernel: acpiphp: Slot [31] registered
Nov 29 00:35:33 np0005539510 kernel: PCI host bridge to bus 0000:00
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 00:35:33 np0005539510 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 00:35:33 np0005539510 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 00:35:33 np0005539510 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 00:35:33 np0005539510 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 00:35:33 np0005539510 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 00:35:33 np0005539510 kernel: iommu: Default domain type: Translated
Nov 29 00:35:33 np0005539510 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 00:35:33 np0005539510 kernel: SCSI subsystem initialized
Nov 29 00:35:33 np0005539510 kernel: ACPI: bus type USB registered
Nov 29 00:35:33 np0005539510 kernel: usbcore: registered new interface driver usbfs
Nov 29 00:35:33 np0005539510 kernel: usbcore: registered new interface driver hub
Nov 29 00:35:33 np0005539510 kernel: usbcore: registered new device driver usb
Nov 29 00:35:33 np0005539510 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 00:35:33 np0005539510 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 00:35:33 np0005539510 kernel: PTP clock support registered
Nov 29 00:35:33 np0005539510 kernel: EDAC MC: Ver: 3.0.0
Nov 29 00:35:33 np0005539510 kernel: NetLabel: Initializing
Nov 29 00:35:33 np0005539510 kernel: NetLabel:  domain hash size = 128
Nov 29 00:35:33 np0005539510 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 00:35:33 np0005539510 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 00:35:33 np0005539510 kernel: PCI: Using ACPI for IRQ routing
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 00:35:33 np0005539510 kernel: vgaarb: loaded
Nov 29 00:35:33 np0005539510 kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 00:35:33 np0005539510 kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 00:35:33 np0005539510 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 00:35:33 np0005539510 kernel: pnp: PnP ACPI init
Nov 29 00:35:33 np0005539510 kernel: pnp: PnP ACPI: found 5 devices
Nov 29 00:35:33 np0005539510 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_INET protocol family
Nov 29 00:35:33 np0005539510 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 00:35:33 np0005539510 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_XDP protocol family
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:33 np0005539510 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 00:35:33 np0005539510 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 00:35:33 np0005539510 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71681 usecs
Nov 29 00:35:33 np0005539510 kernel: PCI: CLS 0 bytes, default 64
Nov 29 00:35:33 np0005539510 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 00:35:33 np0005539510 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 00:35:33 np0005539510 kernel: ACPI: bus type thunderbolt registered
Nov 29 00:35:33 np0005539510 kernel: Trying to unpack rootfs image as initramfs...
Nov 29 00:35:33 np0005539510 kernel: Initialise system trusted keyrings
Nov 29 00:35:33 np0005539510 kernel: Key type blacklist registered
Nov 29 00:35:33 np0005539510 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 00:35:33 np0005539510 kernel: zbud: loaded
Nov 29 00:35:33 np0005539510 kernel: integrity: Platform Keyring initialized
Nov 29 00:35:33 np0005539510 kernel: integrity: Machine keyring initialized
Nov 29 00:35:33 np0005539510 kernel: Freeing initrd memory: 85868K
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_ALG protocol family
Nov 29 00:35:33 np0005539510 kernel: xor: automatically using best checksumming function   avx       
Nov 29 00:35:33 np0005539510 kernel: Key type asymmetric registered
Nov 29 00:35:33 np0005539510 kernel: Asymmetric key parser 'x509' registered
Nov 29 00:35:33 np0005539510 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 00:35:33 np0005539510 kernel: io scheduler mq-deadline registered
Nov 29 00:35:33 np0005539510 kernel: io scheduler kyber registered
Nov 29 00:35:33 np0005539510 kernel: io scheduler bfq registered
Nov 29 00:35:33 np0005539510 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 00:35:33 np0005539510 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 00:35:33 np0005539510 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 00:35:33 np0005539510 kernel: ACPI: button: Power Button [PWRF]
Nov 29 00:35:33 np0005539510 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 00:35:33 np0005539510 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 00:35:33 np0005539510 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 00:35:33 np0005539510 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 00:35:33 np0005539510 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 00:35:33 np0005539510 kernel: Non-volatile memory driver v1.3
Nov 29 00:35:33 np0005539510 kernel: rdac: device handler registered
Nov 29 00:35:33 np0005539510 kernel: hp_sw: device handler registered
Nov 29 00:35:33 np0005539510 kernel: emc: device handler registered
Nov 29 00:35:33 np0005539510 kernel: alua: device handler registered
Nov 29 00:35:33 np0005539510 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 00:35:33 np0005539510 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 00:35:33 np0005539510 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 00:35:33 np0005539510 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 00:35:33 np0005539510 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 00:35:33 np0005539510 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 00:35:33 np0005539510 kernel: usb usb1: Product: UHCI Host Controller
Nov 29 00:35:33 np0005539510 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 00:35:33 np0005539510 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 00:35:33 np0005539510 kernel: hub 1-0:1.0: USB hub found
Nov 29 00:35:33 np0005539510 kernel: hub 1-0:1.0: 2 ports detected
Nov 29 00:35:33 np0005539510 kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 00:35:33 np0005539510 kernel: usbserial: USB Serial support registered for generic
Nov 29 00:35:33 np0005539510 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 00:35:33 np0005539510 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 00:35:33 np0005539510 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 00:35:33 np0005539510 kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 00:35:33 np0005539510 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 00:35:33 np0005539510 kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 00:35:33 np0005539510 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 00:35:33 np0005539510 kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:32 UTC (1764394532)
Nov 29 00:35:33 np0005539510 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 00:35:33 np0005539510 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 00:35:33 np0005539510 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 00:35:33 np0005539510 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 00:35:33 np0005539510 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 00:35:33 np0005539510 kernel: usbcore: registered new interface driver usbhid
Nov 29 00:35:33 np0005539510 kernel: usbhid: USB HID core driver
Nov 29 00:35:33 np0005539510 kernel: drop_monitor: Initializing network drop monitor service
Nov 29 00:35:33 np0005539510 kernel: Initializing XFRM netlink socket
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_INET6 protocol family
Nov 29 00:35:33 np0005539510 kernel: Segment Routing with IPv6
Nov 29 00:35:33 np0005539510 kernel: NET: Registered PF_PACKET protocol family
Nov 29 00:35:33 np0005539510 kernel: mpls_gso: MPLS GSO support
Nov 29 00:35:33 np0005539510 kernel: IPI shorthand broadcast: enabled
Nov 29 00:35:33 np0005539510 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 00:35:33 np0005539510 kernel: AES CTR mode by8 optimization enabled
Nov 29 00:35:33 np0005539510 kernel: sched_clock: Marking stable (1236008419, 144514150)->(1454172619, -73650050)
Nov 29 00:35:33 np0005539510 kernel: registered taskstats version 1
Nov 29 00:35:33 np0005539510 kernel: Loading compiled-in X.509 certificates
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 00:35:33 np0005539510 kernel: Demotion targets for Node 0: null
Nov 29 00:35:33 np0005539510 kernel: page_owner is disabled
Nov 29 00:35:33 np0005539510 kernel: Key type .fscrypt registered
Nov 29 00:35:33 np0005539510 kernel: Key type fscrypt-provisioning registered
Nov 29 00:35:33 np0005539510 kernel: Key type big_key registered
Nov 29 00:35:33 np0005539510 kernel: Key type encrypted registered
Nov 29 00:35:33 np0005539510 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 00:35:33 np0005539510 kernel: Loading compiled-in module X.509 certificates
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:33 np0005539510 kernel: ima: Allocated hash algorithm: sha256
Nov 29 00:35:33 np0005539510 kernel: ima: No architecture policies found
Nov 29 00:35:33 np0005539510 kernel: evm: Initialising EVM extended attributes:
Nov 29 00:35:33 np0005539510 kernel: evm: security.selinux
Nov 29 00:35:33 np0005539510 kernel: evm: security.SMACK64 (disabled)
Nov 29 00:35:33 np0005539510 kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 00:35:33 np0005539510 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 00:35:33 np0005539510 kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 00:35:33 np0005539510 kernel: evm: security.apparmor (disabled)
Nov 29 00:35:33 np0005539510 kernel: evm: security.ima
Nov 29 00:35:33 np0005539510 kernel: evm: security.capability
Nov 29 00:35:33 np0005539510 kernel: evm: HMAC attrs: 0x1
Nov 29 00:35:33 np0005539510 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 00:35:33 np0005539510 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 00:35:33 np0005539510 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 00:35:33 np0005539510 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 00:35:33 np0005539510 kernel: usb 1-1: Manufacturer: QEMU
Nov 29 00:35:33 np0005539510 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 00:35:33 np0005539510 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 00:35:33 np0005539510 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 00:35:33 np0005539510 kernel: Running certificate verification RSA selftest
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 00:35:33 np0005539510 kernel: Running certificate verification ECDSA selftest
Nov 29 00:35:33 np0005539510 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 00:35:33 np0005539510 kernel: clk: Disabling unused clocks
Nov 29 00:35:33 np0005539510 kernel: Freeing unused decrypted memory: 2028K
Nov 29 00:35:33 np0005539510 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 00:35:33 np0005539510 kernel: Write protecting the kernel read-only data: 30720k
Nov 29 00:35:33 np0005539510 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 00:35:33 np0005539510 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 00:35:33 np0005539510 kernel: Run /init as init process
Nov 29 00:35:33 np0005539510 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:33 np0005539510 systemd: Detected virtualization kvm.
Nov 29 00:35:33 np0005539510 systemd: Detected architecture x86-64.
Nov 29 00:35:33 np0005539510 systemd: Running in initrd.
Nov 29 00:35:33 np0005539510 systemd: No hostname configured, using default hostname.
Nov 29 00:35:33 np0005539510 systemd: Hostname set to <localhost>.
Nov 29 00:35:33 np0005539510 systemd: Initializing machine ID from VM UUID.
Nov 29 00:35:33 np0005539510 systemd: Queued start job for default target Initrd Default Target.
Nov 29 00:35:33 np0005539510 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:33 np0005539510 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:33 np0005539510 systemd: Reached target Initrd /usr File System.
Nov 29 00:35:33 np0005539510 systemd: Reached target Local File Systems.
Nov 29 00:35:33 np0005539510 systemd: Reached target Path Units.
Nov 29 00:35:33 np0005539510 systemd: Reached target Slice Units.
Nov 29 00:35:33 np0005539510 systemd: Reached target Swaps.
Nov 29 00:35:33 np0005539510 systemd: Reached target Timer Units.
Nov 29 00:35:33 np0005539510 systemd: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:33 np0005539510 systemd: Listening on Journal Socket (/dev/log).
Nov 29 00:35:33 np0005539510 systemd: Listening on Journal Socket.
Nov 29 00:35:33 np0005539510 systemd: Listening on udev Control Socket.
Nov 29 00:35:33 np0005539510 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:33 np0005539510 systemd: Reached target Socket Units.
Nov 29 00:35:33 np0005539510 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:33 np0005539510 systemd: Starting Journal Service...
Nov 29 00:35:33 np0005539510 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:33 np0005539510 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:33 np0005539510 systemd: Starting Create System Users...
Nov 29 00:35:33 np0005539510 systemd: Starting Setup Virtual Console...
Nov 29 00:35:33 np0005539510 systemd: Finished Create List of Static Device Nodes.
Nov 29 00:35:33 np0005539510 systemd-journald[303]: Journal started
Nov 29 00:35:33 np0005539510 systemd-journald[303]: Runtime Journal (/run/log/journal/4a1784f42c5f4879a5f6acc886e56ebb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:33 np0005539510 systemd: Started Journal Service.
Nov 29 00:35:33 np0005539510 systemd[1]: Finished Apply Kernel Variables.
Nov 29 00:35:33 np0005539510 systemd-sysusers[307]: Creating group 'users' with GID 100.
Nov 29 00:35:33 np0005539510 systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Nov 29 00:35:33 np0005539510 systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 00:35:33 np0005539510 systemd[1]: Finished Create System Users.
Nov 29 00:35:33 np0005539510 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:33 np0005539510 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:33 np0005539510 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:33 np0005539510 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:33 np0005539510 systemd[1]: Finished Setup Virtual Console.
Nov 29 00:35:33 np0005539510 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 00:35:33 np0005539510 systemd[1]: Starting dracut cmdline hook...
Nov 29 00:35:33 np0005539510 dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 00:35:33 np0005539510 dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:33 np0005539510 systemd[1]: Finished dracut cmdline hook.
Nov 29 00:35:33 np0005539510 systemd[1]: Starting dracut pre-udev hook...
Nov 29 00:35:33 np0005539510 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 00:35:33 np0005539510 kernel: device-mapper: uevent: version 1.0.3
Nov 29 00:35:33 np0005539510 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 00:35:33 np0005539510 kernel: RPC: Registered named UNIX socket transport module.
Nov 29 00:35:33 np0005539510 kernel: RPC: Registered udp transport module.
Nov 29 00:35:33 np0005539510 kernel: RPC: Registered tcp transport module.
Nov 29 00:35:33 np0005539510 kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 00:35:33 np0005539510 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 00:35:34 np0005539510 rpc.statd[440]: Version 2.5.4 starting
Nov 29 00:35:34 np0005539510 rpc.statd[440]: Initializing NSM state
Nov 29 00:35:34 np0005539510 rpc.idmapd[445]: Setting log level to 0
Nov 29 00:35:34 np0005539510 systemd[1]: Finished dracut pre-udev hook.
Nov 29 00:35:34 np0005539510 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:34 np0005539510 systemd-udevd[458]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:34 np0005539510 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:34 np0005539510 systemd[1]: Starting dracut pre-trigger hook...
Nov 29 00:35:34 np0005539510 systemd[1]: Finished dracut pre-trigger hook.
Nov 29 00:35:34 np0005539510 systemd[1]: Starting Coldplug All udev Devices...
Nov 29 00:35:34 np0005539510 systemd[1]: Created slice Slice /system/modprobe.
Nov 29 00:35:34 np0005539510 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:34 np0005539510 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:34 np0005539510 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:34 np0005539510 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:34 np0005539510 systemd[1]: Mounting Kernel Configuration File System...
Nov 29 00:35:34 np0005539510 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target Network.
Nov 29 00:35:34 np0005539510 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:34 np0005539510 systemd[1]: Starting dracut initqueue hook...
Nov 29 00:35:34 np0005539510 systemd[1]: Mounted Kernel Configuration File System.
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target System Initialization.
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target Basic System.
Nov 29 00:35:34 np0005539510 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 00:35:34 np0005539510 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 00:35:34 np0005539510 kernel: scsi host0: ata_piix
Nov 29 00:35:34 np0005539510 kernel: vda: vda1
Nov 29 00:35:34 np0005539510 kernel: scsi host1: ata_piix
Nov 29 00:35:34 np0005539510 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 00:35:34 np0005539510 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 00:35:34 np0005539510 kernel: ata1: found unknown device (class 0)
Nov 29 00:35:34 np0005539510 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 00:35:34 np0005539510 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 00:35:34 np0005539510 systemd-udevd[476]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:34 np0005539510 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 00:35:34 np0005539510 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:34 np0005539510 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 00:35:34 np0005539510 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target Initrd Root Device.
Nov 29 00:35:34 np0005539510 systemd[1]: Finished dracut initqueue hook.
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 00:35:34 np0005539510 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:34 np0005539510 systemd[1]: Starting dracut pre-mount hook...
Nov 29 00:35:34 np0005539510 systemd[1]: Finished dracut pre-mount hook.
Nov 29 00:35:34 np0005539510 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 00:35:34 np0005539510 systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 00:35:34 np0005539510 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:34 np0005539510 systemd[1]: Mounting /sysroot...
Nov 29 00:35:35 np0005539510 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 00:35:35 np0005539510 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 00:35:35 np0005539510 kernel: XFS (vda1): Ending clean mount
Nov 29 00:35:35 np0005539510 systemd[1]: Mounted /sysroot.
Nov 29 00:35:35 np0005539510 systemd[1]: Reached target Initrd Root File System.
Nov 29 00:35:35 np0005539510 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 00:35:35 np0005539510 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 00:35:35 np0005539510 systemd[1]: Reached target Initrd File Systems.
Nov 29 00:35:35 np0005539510 systemd[1]: Reached target Initrd Default Target.
Nov 29 00:35:35 np0005539510 systemd[1]: Starting dracut mount hook...
Nov 29 00:35:35 np0005539510 systemd[1]: Finished dracut mount hook.
Nov 29 00:35:35 np0005539510 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 00:35:35 np0005539510 rpc.idmapd[445]: exiting on signal 15
Nov 29 00:35:35 np0005539510 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 00:35:35 np0005539510 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Network.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Timer Units.
Nov 29 00:35:35 np0005539510 systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Initrd Default Target.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Basic System.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Initrd Root Device.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Initrd /usr File System.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Path Units.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Remote File Systems.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Slice Units.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Socket Units.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target System Initialization.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Local File Systems.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Swaps.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut mount hook.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut pre-mount hook.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut initqueue hook.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Setup Virtual Console.
Nov 29 00:35:35 np0005539510 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Closed udev Control Socket.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Closed udev Kernel Socket.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut pre-udev hook.
Nov 29 00:35:35 np0005539510 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped dracut cmdline hook.
Nov 29 00:35:35 np0005539510 systemd[1]: Starting Cleanup udev Database...
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 00:35:35 np0005539510 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 00:35:35 np0005539510 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Stopped Create System Users.
Nov 29 00:35:35 np0005539510 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 00:35:35 np0005539510 systemd[1]: Finished Cleanup udev Database.
Nov 29 00:35:35 np0005539510 systemd[1]: Reached target Switch Root.
Nov 29 00:35:35 np0005539510 systemd[1]: Starting Switch Root...
Nov 29 00:35:35 np0005539510 systemd[1]: Switching root.
Nov 29 00:35:35 np0005539510 systemd-journald[303]: Journal stopped
Nov 29 00:35:36 np0005539510 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 29 00:35:36 np0005539510 kernel: audit: type=1404 audit(1764394535.826:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:35:36 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:35:36 np0005539510 kernel: audit: type=1403 audit(1764394535.949:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 00:35:36 np0005539510 systemd: Successfully loaded SELinux policy in 125.420ms.
Nov 29 00:35:36 np0005539510 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 32.913ms.
Nov 29 00:35:36 np0005539510 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:36 np0005539510 systemd: Detected virtualization kvm.
Nov 29 00:35:36 np0005539510 systemd: Detected architecture x86-64.
Nov 29 00:35:36 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:35:36 np0005539510 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd: Stopped Switch Root.
Nov 29 00:35:36 np0005539510 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 00:35:36 np0005539510 systemd: Created slice Slice /system/getty.
Nov 29 00:35:36 np0005539510 systemd: Created slice Slice /system/serial-getty.
Nov 29 00:35:36 np0005539510 systemd: Created slice Slice /system/sshd-keygen.
Nov 29 00:35:36 np0005539510 systemd: Created slice User and Session Slice.
Nov 29 00:35:36 np0005539510 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:36 np0005539510 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 29 00:35:36 np0005539510 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 00:35:36 np0005539510 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:36 np0005539510 systemd: Stopped target Switch Root.
Nov 29 00:35:36 np0005539510 systemd: Stopped target Initrd File Systems.
Nov 29 00:35:36 np0005539510 systemd: Stopped target Initrd Root File System.
Nov 29 00:35:36 np0005539510 systemd: Reached target Local Integrity Protected Volumes.
Nov 29 00:35:36 np0005539510 systemd: Reached target Path Units.
Nov 29 00:35:36 np0005539510 systemd: Reached target rpc_pipefs.target.
Nov 29 00:35:36 np0005539510 systemd: Reached target Slice Units.
Nov 29 00:35:36 np0005539510 systemd: Reached target Swaps.
Nov 29 00:35:36 np0005539510 systemd: Reached target Local Verity Protected Volumes.
Nov 29 00:35:36 np0005539510 systemd: Listening on RPCbind Server Activation Socket.
Nov 29 00:35:36 np0005539510 systemd: Reached target RPC Port Mapper.
Nov 29 00:35:36 np0005539510 systemd: Listening on Process Core Dump Socket.
Nov 29 00:35:36 np0005539510 systemd: Listening on initctl Compatibility Named Pipe.
Nov 29 00:35:36 np0005539510 systemd: Listening on udev Control Socket.
Nov 29 00:35:36 np0005539510 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:36 np0005539510 systemd: Mounting Huge Pages File System...
Nov 29 00:35:36 np0005539510 systemd: Mounting POSIX Message Queue File System...
Nov 29 00:35:36 np0005539510 systemd: Mounting Kernel Debug File System...
Nov 29 00:35:36 np0005539510 systemd: Mounting Kernel Trace File System...
Nov 29 00:35:36 np0005539510 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:36 np0005539510 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:36 np0005539510 systemd: Starting Load Kernel Module configfs...
Nov 29 00:35:36 np0005539510 systemd: Starting Load Kernel Module drm...
Nov 29 00:35:36 np0005539510 systemd: Starting Load Kernel Module efi_pstore...
Nov 29 00:35:36 np0005539510 systemd: Starting Load Kernel Module fuse...
Nov 29 00:35:36 np0005539510 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 00:35:36 np0005539510 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd: Stopped File System Check on Root Device.
Nov 29 00:35:36 np0005539510 systemd: Stopped Journal Service.
Nov 29 00:35:36 np0005539510 systemd: Starting Journal Service...
Nov 29 00:35:36 np0005539510 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:36 np0005539510 systemd: Starting Generate network units from Kernel command line...
Nov 29 00:35:36 np0005539510 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:36 np0005539510 systemd: Starting Remount Root and Kernel File Systems...
Nov 29 00:35:36 np0005539510 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 00:35:36 np0005539510 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:36 np0005539510 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 00:35:36 np0005539510 kernel: ACPI: bus type drm_connector registered
Nov 29 00:35:36 np0005539510 systemd: Starting Coldplug All udev Devices...
Nov 29 00:35:36 np0005539510 kernel: fuse: init (API version 7.37)
Nov 29 00:35:36 np0005539510 systemd-journald[676]: Journal started
Nov 29 00:35:36 np0005539510 systemd-journald[676]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:36 np0005539510 systemd[1]: Queued start job for default target Multi-User System.
Nov 29 00:35:36 np0005539510 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd: Mounted Huge Pages File System.
Nov 29 00:35:36 np0005539510 systemd: Started Journal Service.
Nov 29 00:35:36 np0005539510 systemd[1]: Mounted POSIX Message Queue File System.
Nov 29 00:35:36 np0005539510 systemd[1]: Mounted Kernel Debug File System.
Nov 29 00:35:36 np0005539510 systemd[1]: Mounted Kernel Trace File System.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 00:35:36 np0005539510 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:36 np0005539510 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Load Kernel Module drm.
Nov 29 00:35:36 np0005539510 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 00:35:36 np0005539510 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Load Kernel Module fuse.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Apply Kernel Variables.
Nov 29 00:35:36 np0005539510 systemd[1]: Mounting FUSE Control File System...
Nov 29 00:35:36 np0005539510 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Rebuild Hardware Database...
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 00:35:36 np0005539510 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Create System Users...
Nov 29 00:35:36 np0005539510 systemd[1]: Mounted FUSE Control File System.
Nov 29 00:35:36 np0005539510 systemd-journald[676]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:36 np0005539510 systemd-journald[676]: Received client request to flush runtime journal.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 00:35:36 np0005539510 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Create System Users.
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:36 np0005539510 systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 00:35:36 np0005539510 systemd[1]: Reached target Local File Systems.
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 00:35:36 np0005539510 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 00:35:36 np0005539510 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 00:35:36 np0005539510 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 00:35:36 np0005539510 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:36 np0005539510 bootctl[694]: Couldn't find EFI system partition, skipping.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Security Auditing Service...
Nov 29 00:35:36 np0005539510 systemd[1]: Starting RPC Bind...
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 00:35:36 np0005539510 auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 00:35:36 np0005539510 auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 00:35:36 np0005539510 systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 00:35:36 np0005539510 augenrules[705]: /sbin/augenrules: No change
Nov 29 00:35:36 np0005539510 systemd[1]: Started RPC Bind.
Nov 29 00:35:36 np0005539510 augenrules[720]: No rules
Nov 29 00:35:36 np0005539510 augenrules[720]: enabled 1
Nov 29 00:35:36 np0005539510 augenrules[720]: failure 1
Nov 29 00:35:36 np0005539510 augenrules[720]: pid 699
Nov 29 00:35:36 np0005539510 augenrules[720]: rate_limit 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_limit 8192
Nov 29 00:35:36 np0005539510 augenrules[720]: lost 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog 4
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:36 np0005539510 augenrules[720]: enabled 1
Nov 29 00:35:36 np0005539510 augenrules[720]: failure 1
Nov 29 00:35:36 np0005539510 augenrules[720]: pid 699
Nov 29 00:35:36 np0005539510 augenrules[720]: rate_limit 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_limit 8192
Nov 29 00:35:36 np0005539510 augenrules[720]: lost 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog 1
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:36 np0005539510 augenrules[720]: enabled 1
Nov 29 00:35:36 np0005539510 augenrules[720]: failure 1
Nov 29 00:35:36 np0005539510 augenrules[720]: pid 699
Nov 29 00:35:36 np0005539510 augenrules[720]: rate_limit 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_limit 8192
Nov 29 00:35:36 np0005539510 augenrules[720]: lost 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog 0
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:36 np0005539510 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:36 np0005539510 systemd[1]: Started Security Auditing Service.
Nov 29 00:35:36 np0005539510 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 00:35:37 np0005539510 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 00:35:37 np0005539510 systemd[1]: Finished Rebuild Hardware Database.
Nov 29 00:35:37 np0005539510 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:37 np0005539510 systemd[1]: Starting Update is Completed...
Nov 29 00:35:37 np0005539510 systemd[1]: Finished Update is Completed.
Nov 29 00:35:37 np0005539510 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:37 np0005539510 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:37 np0005539510 systemd[1]: Reached target System Initialization.
Nov 29 00:35:37 np0005539510 systemd[1]: Started dnf makecache --timer.
Nov 29 00:35:37 np0005539510 systemd[1]: Started Daily rotation of log files.
Nov 29 00:35:37 np0005539510 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 00:35:37 np0005539510 systemd[1]: Reached target Timer Units.
Nov 29 00:35:37 np0005539510 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:37 np0005539510 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 00:35:37 np0005539510 systemd[1]: Reached target Socket Units.
Nov 29 00:35:37 np0005539510 systemd[1]: Starting D-Bus System Message Bus...
Nov 29 00:35:37 np0005539510 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:37 np0005539510 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:37 np0005539510 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:37 np0005539510 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:37 np0005539510 systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:37 np0005539510 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 00:35:37 np0005539510 systemd[1]: Started D-Bus System Message Bus.
Nov 29 00:35:37 np0005539510 systemd[1]: Reached target Basic System.
Nov 29 00:35:37 np0005539510 dbus-broker-lau[765]: Ready
Nov 29 00:35:37 np0005539510 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 00:35:37 np0005539510 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 00:35:37 np0005539510 systemd[1]: Starting NTP client/server...
Nov 29 00:35:37 np0005539510 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 00:35:37 np0005539510 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 00:35:37 np0005539510 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 00:35:37 np0005539510 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 00:35:37 np0005539510 systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 00:35:37 np0005539510 systemd[1]: Started irqbalance daemon.
Nov 29 00:35:37 np0005539510 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 00:35:37 np0005539510 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:37 np0005539510 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:37 np0005539510 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:37 np0005539510 systemd[1]: Reached target sshd-keygen.target.
Nov 29 00:35:37 np0005539510 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 00:35:37 np0005539510 systemd[1]: Reached target User and Group Name Lookups.
Nov 29 00:35:37 np0005539510 systemd[1]: Starting User Login Management...
Nov 29 00:35:37 np0005539510 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 00:35:37 np0005539510 chronyd[787]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 00:35:37 np0005539510 chronyd[787]: Loaded 0 symmetric keys
Nov 29 00:35:37 np0005539510 chronyd[787]: Using right/UTC timezone to obtain leap second data
Nov 29 00:35:37 np0005539510 chronyd[787]: Loaded seccomp filter (level 2)
Nov 29 00:35:37 np0005539510 systemd[1]: Started NTP client/server.
Nov 29 00:35:37 np0005539510 systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 00:35:37 np0005539510 systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 00:35:37 np0005539510 systemd-logind[784]: New seat seat0.
Nov 29 00:35:37 np0005539510 systemd[1]: Started User Login Management.
Nov 29 00:35:37 np0005539510 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 00:35:37 np0005539510 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 00:35:37 np0005539510 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 00:35:37 np0005539510 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 00:35:37 np0005539510 kernel: Console: switching to colour dummy device 80x25
Nov 29 00:35:37 np0005539510 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 00:35:37 np0005539510 kernel: [drm] features: -context_init
Nov 29 00:35:37 np0005539510 kernel: [drm] number of scanouts: 1
Nov 29 00:35:37 np0005539510 kernel: [drm] number of cap sets: 0
Nov 29 00:35:37 np0005539510 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 00:35:37 np0005539510 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 00:35:37 np0005539510 kernel: Console: switching to colour frame buffer device 128x48
Nov 29 00:35:37 np0005539510 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 00:35:37 np0005539510 kernel: kvm_amd: TSC scaling supported
Nov 29 00:35:37 np0005539510 kernel: kvm_amd: Nested Virtualization enabled
Nov 29 00:35:37 np0005539510 kernel: kvm_amd: Nested Paging enabled
Nov 29 00:35:37 np0005539510 kernel: kvm_amd: LBR virtualization supported
Nov 29 00:35:37 np0005539510 iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Nov 29 00:35:37 np0005539510 systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 00:35:37 np0005539510 cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:37 +0000. Up 6.56 seconds.
Nov 29 00:35:38 np0005539510 systemd[1]: run-cloud\x2dinit-tmp-tmpqai37alp.mount: Deactivated successfully.
Nov 29 00:35:38 np0005539510 systemd[1]: Starting Hostname Service...
Nov 29 00:35:38 np0005539510 systemd[1]: Started Hostname Service.
Nov 29 00:35:38 np0005539510 systemd-hostnamed[853]: Hostname set to <np0005539510.novalocal> (static)
Nov 29 00:35:38 np0005539510 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 00:35:38 np0005539510 systemd[1]: Reached target Preparation for Network.
Nov 29 00:35:38 np0005539510 systemd[1]: Starting Network Manager...
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.4849] NetworkManager (version 1.54.1-1.el9) is starting... (boot:631d2949-c1d4-4f67-afc4-db082a3ff43a)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.4856] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.4943] manager[0x563c469b8080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.4984] hostname: hostname: using hostnamed
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.4984] hostname: static hostname changed from (none) to "np0005539510.novalocal"
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.4989] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5160] manager[0x563c469b8080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5160] manager[0x563c469b8080]: rfkill: WWAN hardware radio set enabled
Nov 29 00:35:38 np0005539510 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5210] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5210] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5212] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5212] manager: Networking is enabled by state file
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5214] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5226] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5259] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5280] dhcp: init: Using DHCP client 'internal'
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5283] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5301] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5311] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5321] device (lo): Activation: starting connection 'lo' (5b6e73d6-4c36-495c-9d49-56d866cbd8e2)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5333] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5337] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5376] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5384] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5386] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5390] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5392] device (eth0): carrier: link connected
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5394] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5405] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:35:38 np0005539510 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5417] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5441] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5443] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5445] manager: NetworkManager state is now CONNECTING
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5447] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5458] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5461] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:35:38 np0005539510 systemd[1]: Started Network Manager.
Nov 29 00:35:38 np0005539510 systemd[1]: Reached target Network.
Nov 29 00:35:38 np0005539510 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:35:38 np0005539510 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 00:35:38 np0005539510 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5672] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5676] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.5692] device (lo): Activation: successful, device activated.
Nov 29 00:35:38 np0005539510 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 00:35:38 np0005539510 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:38 np0005539510 systemd[1]: Reached target NFS client services.
Nov 29 00:35:38 np0005539510 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:38 np0005539510 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:38 np0005539510 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7224] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7237] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7259] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7309] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7313] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7320] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7327] device (eth0): Activation: successful, device activated.
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7338] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:35:38 np0005539510 NetworkManager[857]: <info>  [1764394538.7344] manager: startup complete
Nov 29 00:35:38 np0005539510 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:35:38 np0005539510 systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 00:35:39 np0005539510 cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:35:39 +0000. Up 7.72 seconds.
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |  eth0  | True |         38.102.83.94         | 255.255.255.0 | global | fa:16:3e:62:80:6f |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe62:806f/64 |       .       |  link  | fa:16:3e:62:80:6f |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 00:35:39 np0005539510 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:40 np0005539510 cloud-init[920]: Generating public/private rsa key pair.
Nov 29 00:35:40 np0005539510 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 00:35:40 np0005539510 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 00:35:40 np0005539510 cloud-init[920]: The key fingerprint is:
Nov 29 00:35:40 np0005539510 cloud-init[920]: SHA256:RCpG0KO3RkDTAwx6sae5aoVt8m1SH/n2QdTs/YFRGOM root@np0005539510.novalocal
Nov 29 00:35:40 np0005539510 cloud-init[920]: The key's randomart image is:
Nov 29 00:35:40 np0005539510 cloud-init[920]: +---[RSA 3072]----+
Nov 29 00:35:40 np0005539510 cloud-init[920]: |.+*=.   .    oo. |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |. o==  o    +.o  |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |. oo+o. .  . E   |
Nov 29 00:35:40 np0005539510 cloud-init[920]: | ..=o. .  . . +  |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |  =o .  S  . o o |
Nov 29 00:35:40 np0005539510 cloud-init[920]: | o =o. o  .     o|
Nov 29 00:35:40 np0005539510 cloud-init[920]: |  *.o . o  .    .|
Nov 29 00:35:40 np0005539510 cloud-init[920]: | o o o . o  .    |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |o   o   . ..     |
Nov 29 00:35:40 np0005539510 cloud-init[920]: +----[SHA256]-----+
Nov 29 00:35:40 np0005539510 cloud-init[920]: Generating public/private ecdsa key pair.
Nov 29 00:35:40 np0005539510 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 00:35:40 np0005539510 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 00:35:40 np0005539510 cloud-init[920]: The key fingerprint is:
Nov 29 00:35:40 np0005539510 cloud-init[920]: SHA256:nG6jfywUu6AsvzEUvN5IQcVxqXbc/fw8Kbp0W8eiUhM root@np0005539510.novalocal
Nov 29 00:35:40 np0005539510 cloud-init[920]: The key's randomart image is:
Nov 29 00:35:40 np0005539510 cloud-init[920]: +---[ECDSA 256]---+
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    .oo...       |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |   o  ...        |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    +  o . .     |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |     +o.+.. E    |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    +. .So   +   |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |   + o..o   o o. |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |   .=..o+o o o.++|
Nov 29 00:35:40 np0005539510 cloud-init[920]: |  . oo oo.= .oo++|
Nov 29 00:35:40 np0005539510 cloud-init[920]: |   oo....o ++.. .|
Nov 29 00:35:40 np0005539510 cloud-init[920]: +----[SHA256]-----+
Nov 29 00:35:40 np0005539510 cloud-init[920]: Generating public/private ed25519 key pair.
Nov 29 00:35:40 np0005539510 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 00:35:40 np0005539510 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 00:35:40 np0005539510 cloud-init[920]: The key fingerprint is:
Nov 29 00:35:40 np0005539510 cloud-init[920]: SHA256:mASfPWBs+dgCD81awWsnWl9vNcMu1QkamaXtCAzGVEw root@np0005539510.novalocal
Nov 29 00:35:40 np0005539510 cloud-init[920]: The key's randomart image is:
Nov 29 00:35:40 np0005539510 cloud-init[920]: +--[ED25519 256]--+
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    .=**+E  +.   |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    o+X=o. +o.   |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |     B+=oo .oo...|
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    ..Oo+.o.o *..|
Nov 29 00:35:40 np0005539510 cloud-init[920]: |     +o=S. o = o |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |    .   .   + .  |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |           . .   |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |                 |
Nov 29 00:35:40 np0005539510 cloud-init[920]: |                 |
Nov 29 00:35:40 np0005539510 cloud-init[920]: +----[SHA256]-----+
Nov 29 00:35:40 np0005539510 systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 00:35:40 np0005539510 systemd[1]: Reached target Cloud-config availability.
Nov 29 00:35:40 np0005539510 systemd[1]: Reached target Network is Online.
Nov 29 00:35:40 np0005539510 systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 00:35:40 np0005539510 systemd[1]: Starting Crash recovery kernel arming...
Nov 29 00:35:40 np0005539510 systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 00:35:40 np0005539510 systemd[1]: Starting System Logging Service...
Nov 29 00:35:40 np0005539510 sm-notify[1002]: Version 2.5.4 starting
Nov 29 00:35:40 np0005539510 systemd[1]: Starting OpenSSH server daemon...
Nov 29 00:35:40 np0005539510 systemd[1]: Starting Permit User Sessions...
Nov 29 00:35:40 np0005539510 systemd[1]: Started OpenSSH server daemon.
Nov 29 00:35:40 np0005539510 systemd[1]: Started Notify NFS peers of a restart.
Nov 29 00:35:40 np0005539510 systemd[1]: Finished Permit User Sessions.
Nov 29 00:35:40 np0005539510 systemd[1]: Started Command Scheduler.
Nov 29 00:35:40 np0005539510 systemd[1]: Started Getty on tty1.
Nov 29 00:35:40 np0005539510 systemd[1]: Started Serial Getty on ttyS0.
Nov 29 00:35:40 np0005539510 systemd[1]: Reached target Login Prompts.
Nov 29 00:35:40 np0005539510 rsyslogd[1003]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1003" x-info="https://www.rsyslog.com"] start
Nov 29 00:35:40 np0005539510 rsyslogd[1003]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 00:35:40 np0005539510 systemd[1]: Started System Logging Service.
Nov 29 00:35:40 np0005539510 systemd[1]: Reached target Multi-User System.
Nov 29 00:35:40 np0005539510 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 00:35:40 np0005539510 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 00:35:40 np0005539510 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 00:35:40 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 00:35:40 np0005539510 kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Nov 29 00:35:40 np0005539510 kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 00:35:40 np0005539510 cloud-init[1076]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:35:40 +0000. Up 9.39 seconds.
Nov 29 00:35:40 np0005539510 systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 00:35:40 np0005539510 systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 00:35:41 np0005539510 cloud-init[1280]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:35:41 +0000. Up 9.81 seconds.
Nov 29 00:35:41 np0005539510 dracut[1284]: dracut-057-102.git20250818.el9
Nov 29 00:35:41 np0005539510 cloud-init[1301]: #############################################################
Nov 29 00:35:41 np0005539510 cloud-init[1302]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 00:35:41 np0005539510 cloud-init[1304]: 256 SHA256:nG6jfywUu6AsvzEUvN5IQcVxqXbc/fw8Kbp0W8eiUhM root@np0005539510.novalocal (ECDSA)
Nov 29 00:35:41 np0005539510 cloud-init[1306]: 256 SHA256:mASfPWBs+dgCD81awWsnWl9vNcMu1QkamaXtCAzGVEw root@np0005539510.novalocal (ED25519)
Nov 29 00:35:41 np0005539510 cloud-init[1308]: 3072 SHA256:RCpG0KO3RkDTAwx6sae5aoVt8m1SH/n2QdTs/YFRGOM root@np0005539510.novalocal (RSA)
Nov 29 00:35:41 np0005539510 cloud-init[1309]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 00:35:41 np0005539510 cloud-init[1310]: #############################################################
Nov 29 00:35:41 np0005539510 cloud-init[1280]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:35:41 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.00 seconds
Nov 29 00:35:41 np0005539510 systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 00:35:41 np0005539510 systemd[1]: Reached target Cloud-init target.
Nov 29 00:35:41 np0005539510 dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:35:41 np0005539510 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: memstrack is not available
Nov 29 00:35:42 np0005539510 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:35:42 np0005539510 dracut[1286]: memstrack is not available
Nov 29 00:35:42 np0005539510 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:35:43 np0005539510 dracut[1286]: *** Including module: systemd ***
Nov 29 00:35:43 np0005539510 dracut[1286]: *** Including module: fips ***
Nov 29 00:35:43 np0005539510 dracut[1286]: *** Including module: systemd-initrd ***
Nov 29 00:35:43 np0005539510 chronyd[787]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 00:35:43 np0005539510 chronyd[787]: System clock TAI offset set to 37 seconds
Nov 29 00:35:43 np0005539510 dracut[1286]: *** Including module: i18n ***
Nov 29 00:35:43 np0005539510 dracut[1286]: *** Including module: drm ***
Nov 29 00:35:44 np0005539510 dracut[1286]: *** Including module: prefixdevname ***
Nov 29 00:35:44 np0005539510 dracut[1286]: *** Including module: kernel-modules ***
Nov 29 00:35:44 np0005539510 kernel: block vda: the capability attribute has been deprecated.
Nov 29 00:35:45 np0005539510 dracut[1286]: *** Including module: kernel-modules-extra ***
Nov 29 00:35:45 np0005539510 dracut[1286]: *** Including module: qemu ***
Nov 29 00:35:45 np0005539510 dracut[1286]: *** Including module: fstab-sys ***
Nov 29 00:35:45 np0005539510 dracut[1286]: *** Including module: rootfs-block ***
Nov 29 00:35:45 np0005539510 dracut[1286]: *** Including module: terminfo ***
Nov 29 00:35:45 np0005539510 dracut[1286]: *** Including module: udev-rules ***
Nov 29 00:35:46 np0005539510 dracut[1286]: Skipping udev rule: 91-permissions.rules
Nov 29 00:35:46 np0005539510 dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 00:35:46 np0005539510 dracut[1286]: *** Including module: virtiofs ***
Nov 29 00:35:46 np0005539510 dracut[1286]: *** Including module: dracut-systemd ***
Nov 29 00:35:46 np0005539510 dracut[1286]: *** Including module: usrmount ***
Nov 29 00:35:46 np0005539510 dracut[1286]: *** Including module: base ***
Nov 29 00:35:46 np0005539510 dracut[1286]: *** Including module: fs-lib ***
Nov 29 00:35:46 np0005539510 dracut[1286]: *** Including module: kdumpbase ***
Nov 29 00:35:47 np0005539510 dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 00:35:47 np0005539510 dracut[1286]:  microcode_ctl module: mangling fw_dir
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 00:35:47 np0005539510 dracut[1286]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 00:35:47 np0005539510 dracut[1286]: *** Including module: openssl ***
Nov 29 00:35:48 np0005539510 dracut[1286]: *** Including module: shutdown ***
Nov 29 00:35:48 np0005539510 dracut[1286]: *** Including module: squash ***
Nov 29 00:35:48 np0005539510 dracut[1286]: *** Including modules done ***
Nov 29 00:35:48 np0005539510 dracut[1286]: *** Installing kernel module dependencies ***
Nov 29 00:35:48 np0005539510 irqbalance[780]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 29 00:35:48 np0005539510 irqbalance[780]: IRQ 25 affinity is now unmanaged
Nov 29 00:35:48 np0005539510 irqbalance[780]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 00:35:48 np0005539510 irqbalance[780]: IRQ 31 affinity is now unmanaged
Nov 29 00:35:48 np0005539510 irqbalance[780]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 29 00:35:48 np0005539510 irqbalance[780]: IRQ 28 affinity is now unmanaged
Nov 29 00:35:48 np0005539510 irqbalance[780]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 29 00:35:48 np0005539510 irqbalance[780]: IRQ 32 affinity is now unmanaged
Nov 29 00:35:48 np0005539510 irqbalance[780]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 00:35:48 np0005539510 irqbalance[780]: IRQ 30 affinity is now unmanaged
Nov 29 00:35:48 np0005539510 irqbalance[780]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 00:35:48 np0005539510 irqbalance[780]: IRQ 29 affinity is now unmanaged
Nov 29 00:35:48 np0005539510 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:35:49 np0005539510 dracut[1286]: *** Installing kernel module dependencies done ***
Nov 29 00:35:49 np0005539510 dracut[1286]: *** Resolving executable dependencies ***
Nov 29 00:35:51 np0005539510 dracut[1286]: *** Resolving executable dependencies done ***
Nov 29 00:35:51 np0005539510 dracut[1286]: *** Generating early-microcode cpio image ***
Nov 29 00:35:51 np0005539510 dracut[1286]: *** Store current command line parameters ***
Nov 29 00:35:51 np0005539510 dracut[1286]: Stored kernel commandline:
Nov 29 00:35:51 np0005539510 dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Nov 29 00:35:51 np0005539510 dracut[1286]: *** Install squash loader ***
Nov 29 00:35:52 np0005539510 dracut[1286]: *** Squashing the files inside the initramfs ***
Nov 29 00:35:53 np0005539510 dracut[1286]: *** Squashing the files inside the initramfs done ***
Nov 29 00:35:53 np0005539510 dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 00:35:53 np0005539510 dracut[1286]: *** Hardlinking files ***
Nov 29 00:35:53 np0005539510 dracut[1286]: *** Hardlinking files done ***
Nov 29 00:35:53 np0005539510 dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 00:35:54 np0005539510 kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Nov 29 00:35:54 np0005539510 kdumpctl[1015]: kdump: Starting kdump: [OK]
Nov 29 00:35:54 np0005539510 systemd[1]: Finished Crash recovery kernel arming.
Nov 29 00:35:54 np0005539510 systemd[1]: Startup finished in 1.667s (kernel) + 2.814s (initrd) + 18.523s (userspace) = 23.004s.
Nov 29 00:36:08 np0005539510 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:37:40 np0005539510 systemd[1]: Created slice User Slice of UID 1000.
Nov 29 00:37:40 np0005539510 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 00:37:40 np0005539510 systemd-logind[784]: New session 1 of user zuul.
Nov 29 00:37:40 np0005539510 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 00:37:40 np0005539510 systemd[1]: Starting User Manager for UID 1000...
Nov 29 00:37:40 np0005539510 systemd[4302]: Queued start job for default target Main User Target.
Nov 29 00:37:40 np0005539510 systemd[4302]: Created slice User Application Slice.
Nov 29 00:37:40 np0005539510 systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 00:37:40 np0005539510 systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 00:37:40 np0005539510 systemd[4302]: Reached target Paths.
Nov 29 00:37:40 np0005539510 systemd[4302]: Reached target Timers.
Nov 29 00:37:40 np0005539510 systemd[4302]: Starting D-Bus User Message Bus Socket...
Nov 29 00:37:40 np0005539510 systemd[4302]: Starting Create User's Volatile Files and Directories...
Nov 29 00:37:40 np0005539510 systemd[4302]: Finished Create User's Volatile Files and Directories.
Nov 29 00:37:40 np0005539510 systemd[4302]: Listening on D-Bus User Message Bus Socket.
Nov 29 00:37:40 np0005539510 systemd[4302]: Reached target Sockets.
Nov 29 00:37:40 np0005539510 systemd[4302]: Reached target Basic System.
Nov 29 00:37:40 np0005539510 systemd[4302]: Reached target Main User Target.
Nov 29 00:37:40 np0005539510 systemd[4302]: Startup finished in 117ms.
Nov 29 00:37:40 np0005539510 systemd[1]: Started User Manager for UID 1000.
Nov 29 00:37:40 np0005539510 systemd[1]: Started Session 1 of User zuul.
Nov 29 00:37:41 np0005539510 python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:37:46 np0005539510 python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:37:52 np0005539510 python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:37:53 np0005539510 python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 00:37:55 np0005539510 python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrxzXgpPmVv8+7+5w1Oy1RsXOPeqdxTcUlq37d0RcYulAAKXWla/qJwAX46v5xh/Mg4GnRpk77lvDWcVnOQjFYQg3OeLmFgDDNPV0YL7URmIe/MvgcqM+Kx7/SQjk+hEt7rUIqkFUjeREX60T5eTEMANFgJrljqZcBTMgYr67x4v7oFELzKuZIO0SCAprJ9NYmdRaC+DsjZjU+DuFdHBnfZCpgkTFMCda2FAS9BneAVOIMCBu5RgNVJXeAgIsPX9GNX3qDJMKOluQLOW++2gbue3S1Nrs1GMPm+IPRD4yWc9eZs1tpR1jdP1BEPBpyQRQlUn4z7BUdEogSzYiXCSmqzN1o/R3mdi16bG8e2lHve5MQFABPko8KsgVOJu0H7b7wGo/oGdXH7sdlKuGoWxWyTFcq3RcVkaVgjKtt6zeswkrpxMUv9/6NXPrhIWqdQm/wVw0Pv2p98yq10QRPyBv5yI8zcNjxueUl3aM8SZML87E6lhkUFFdAuVof+Sl5Pz8= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:37:55 np0005539510 python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:37:56 np0005539510 python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:37:56 np0005539510 python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394676.172576-254-274465802037065/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa follow=False checksum=5ac8bea8bfb8f348688bf24843ddb1285b2d351d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:37:57 np0005539510 python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:37:57 np0005539510 python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394677.1864204-308-28332985242287/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=601e897125784122ba5d7472ada57b1d_id_rsa.pub follow=False checksum=48b31d706687f3385690285b8caeaea67ea8286c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:37:58 np0005539510 irqbalance[780]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 29 00:37:58 np0005539510 irqbalance[780]: IRQ 27 affinity is now unmanaged
Nov 29 00:37:59 np0005539510 python3[4972]: ansible-ping Invoked with data=pong
Nov 29 00:38:00 np0005539510 python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:38:02 np0005539510 python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 00:38:03 np0005539510 python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:03 np0005539510 python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:03 np0005539510 python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:05 np0005539510 python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:05 np0005539510 python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:05 np0005539510 python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:07 np0005539510 python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:08 np0005539510 python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:08 np0005539510 python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394687.5909472-34-181945937682816/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:09 np0005539510 python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:09 np0005539510 python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:09 np0005539510 python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539510 python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539510 python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539510 python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:10 np0005539510 python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:11 np0005539510 python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:11 np0005539510 python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:11 np0005539510 python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:12 np0005539510 python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:12 np0005539510 python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:12 np0005539510 python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539510 python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539510 python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539510 python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:13 np0005539510 python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539510 python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539510 python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539510 python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:14 np0005539510 python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539510 python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539510 python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539510 python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:15 np0005539510 python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:16 np0005539510 python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:38:19 np0005539510 python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 00:38:19 np0005539510 systemd[1]: Starting Time & Date Service...
Nov 29 00:38:19 np0005539510 systemd[1]: Started Time & Date Service.
Nov 29 00:38:19 np0005539510 systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Nov 29 00:38:19 np0005539510 python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:20 np0005539510 python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:20 np0005539510 python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764394700.0409346-254-112154858688938/source _original_basename=tmpzfxmfq3f follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:21 np0005539510 python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:21 np0005539510 python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394701.144427-304-50445399397328/source _original_basename=tmp21_6hg5b follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:22 np0005539510 python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:22 np0005539510 python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764394702.42867-384-118647596943086/source _original_basename=tmp9vticgkx follow=False checksum=de28d19618025176a7a65eba0e40c742fe7af9f4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:23 np0005539510 python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:38:23 np0005539510 python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:38:24 np0005539510 python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:38:25 np0005539510 python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394704.4042313-454-225249419640179/source _original_basename=tmpugphsqpn follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:25 np0005539510 python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3d5b-5bb0-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:38:26 np0005539510 python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-3d5b-5bb0-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 00:38:27 np0005539510 python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:46 np0005539510 python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:38:49 np0005539510 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 00:39:46 np0005539510 systemd-logind[784]: Session 1 logged out. Waiting for processes to exit.
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 00:40:17 np0005539510 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 00:40:17 np0005539510 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5536] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:40:17 np0005539510 systemd-udevd[6947]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5730] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5754] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5758] device (eth1): carrier: link connected
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5760] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5765] policy: auto-activating connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e)
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5769] device (eth1): Activation: starting connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e)
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5770] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5774] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5778] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:40:17 np0005539510 NetworkManager[857]: <info>  [1764394817.5782] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:17 np0005539510 systemd[4302]: Starting Mark boot as successful...
Nov 29 00:40:17 np0005539510 systemd[4302]: Finished Mark boot as successful.
Nov 29 00:40:18 np0005539510 systemd-logind[784]: New session 3 of user zuul.
Nov 29 00:40:18 np0005539510 systemd[1]: Started Session 3 of User zuul.
Nov 29 00:40:18 np0005539510 python3[6978]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4e5a-44df-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:40:28 np0005539510 python3[7060]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:40:29 np0005539510 python3[7133]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764394828.6393135-206-59077736288529/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=348c98c735136a2106546cb80073c0e23d947857 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:40:29 np0005539510 python3[7183]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 00:40:29 np0005539510 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 00:40:29 np0005539510 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 00:40:29 np0005539510 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 00:40:29 np0005539510 systemd[1]: Stopping Network Manager...
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8138] caught SIGTERM, shutting down normally.
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8148] dhcp4 (eth0): canceled DHCP transaction
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8149] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8149] dhcp4 (eth0): state changed no lease
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8152] manager: NetworkManager state is now CONNECTING
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8268] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8270] dhcp4 (eth1): state changed no lease
Nov 29 00:40:29 np0005539510 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:40:29 np0005539510 NetworkManager[857]: <info>  [1764394829.8333] exiting (success)
Nov 29 00:40:29 np0005539510 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:40:29 np0005539510 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 00:40:29 np0005539510 systemd[1]: Stopped Network Manager.
Nov 29 00:40:29 np0005539510 systemd[1]: NetworkManager.service: Consumed 1.876s CPU time, 9.9M memory peak.
Nov 29 00:40:29 np0005539510 systemd[1]: Starting Network Manager...
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.8797] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:631d2949-c1d4-4f67-afc4-db082a3ff43a)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.8798] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.8842] manager[0x564e29cae070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:40:29 np0005539510 systemd[1]: Starting Hostname Service...
Nov 29 00:40:29 np0005539510 systemd[1]: Started Hostname Service.
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9532] hostname: hostname: using hostnamed
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9532] hostname: static hostname changed from (none) to "np0005539510.novalocal"
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9537] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9541] manager[0x564e29cae070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9541] manager[0x564e29cae070]: rfkill: WWAN hardware radio set enabled
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9563] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9564] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9564] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9565] manager: Networking is enabled by state file
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9566] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9570] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9589] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9595] dhcp: init: Using DHCP client 'internal'
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9597] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9600] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9604] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9610] device (lo): Activation: starting connection 'lo' (5b6e73d6-4c36-495c-9d49-56d866cbd8e2)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9614] device (eth0): carrier: link connected
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9618] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9621] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9621] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9626] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9632] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9636] device (eth1): carrier: link connected
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9639] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9642] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e) (indicated)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9643] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9646] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9651] device (eth1): Activation: starting connection 'Wired connection 1' (30147632-9597-375e-a51b-e6c74b52332e)
Nov 29 00:40:29 np0005539510 systemd[1]: Started Network Manager.
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9656] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9658] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9660] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9662] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9663] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9666] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9668] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9670] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9673] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9679] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9682] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9689] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9691] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9707] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9709] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9722] device (lo): Activation: successful, device activated.
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9731] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9737] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9790] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9807] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9809] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9813] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9817] device (eth0): Activation: successful, device activated.
Nov 29 00:40:29 np0005539510 NetworkManager[7196]: <info>  [1764394829.9823] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:40:29 np0005539510 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:40:30 np0005539510 python3[7267]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4e5a-44df-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:40:40 np0005539510 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:40:59 np0005539510 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3449] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:41:15 np0005539510 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:41:15 np0005539510 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3753] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3755] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3762] device (eth1): Activation: successful, device activated.
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3768] manager: startup complete
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3771] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <warn>  [1764394875.3776] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3782] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3928] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3929] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3929] dhcp4 (eth1): state changed no lease
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3943] policy: auto-activating connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3947] device (eth1): Activation: starting connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3948] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3951] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3957] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.3968] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.4016] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.4023] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:41:15 np0005539510 NetworkManager[7196]: <info>  [1764394875.4030] device (eth1): Activation: successful, device activated.
Nov 29 00:41:25 np0005539510 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:41:30 np0005539510 systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 00:41:30 np0005539510 systemd[1]: session-3.scope: Consumed 1.415s CPU time.
Nov 29 00:41:30 np0005539510 systemd-logind[784]: Session 3 logged out. Waiting for processes to exit.
Nov 29 00:41:30 np0005539510 systemd-logind[784]: Removed session 3.
Nov 29 00:41:45 np0005539510 systemd-logind[784]: New session 4 of user zuul.
Nov 29 00:41:45 np0005539510 systemd[1]: Started Session 4 of User zuul.
Nov 29 00:41:46 np0005539510 python3[7381]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:41:46 np0005539510 python3[7454]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764394905.7018907-373-256392526151513/source _original_basename=tmp9heeay82 follow=False checksum=95c43167cb69fbe3f3b9eff0c3ecf63c2bbd5b70 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:41:48 np0005539510 systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 00:41:48 np0005539510 systemd-logind[784]: Session 4 logged out. Waiting for processes to exit.
Nov 29 00:41:48 np0005539510 systemd-logind[784]: Removed session 4.
Nov 29 00:43:39 np0005539510 systemd[4302]: Created slice User Background Tasks Slice.
Nov 29 00:43:39 np0005539510 systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 00:43:39 np0005539510 systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 00:47:03 np0005539510 systemd-logind[784]: New session 5 of user zuul.
Nov 29 00:47:03 np0005539510 systemd[1]: Started Session 5 of User zuul.
Nov 29 00:47:03 np0005539510 python3[7516]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca2-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:04 np0005539510 python3[7545]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:04 np0005539510 python3[7571]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:04 np0005539510 python3[7597]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:05 np0005539510 python3[7623]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:05 np0005539510 python3[7649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:06 np0005539510 python3[7727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:06 np0005539510 python3[7800]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395225.882199-368-163779388183193/source _original_basename=tmpwwfcjsx3 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:07 np0005539510 python3[7850]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 00:47:07 np0005539510 systemd[1]: Reloading.
Nov 29 00:47:07 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:47:09 np0005539510 python3[7905]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 00:47:10 np0005539510 python3[7931]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:10 np0005539510 python3[7959]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:10 np0005539510 python3[7987]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:10 np0005539510 python3[8015]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:11 np0005539510 python3[8043]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-b110-1686-000000000ca9-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:12 np0005539510 python3[8073]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 00:47:15 np0005539510 systemd-logind[784]: Session 5 logged out. Waiting for processes to exit.
Nov 29 00:47:15 np0005539510 systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 00:47:15 np0005539510 systemd[1]: session-5.scope: Consumed 4.398s CPU time.
Nov 29 00:47:15 np0005539510 systemd-logind[784]: Removed session 5.
Nov 29 00:47:16 np0005539510 systemd-logind[784]: New session 6 of user zuul.
Nov 29 00:47:16 np0005539510 systemd[1]: Started Session 6 of User zuul.
Nov 29 00:47:17 np0005539510 python3[8106]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 00:47:30 np0005539510 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:30 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:47:38 np0005539510 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:38 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:47:46 np0005539510 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:46 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:47:48 np0005539510 setsebool[8174]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 00:47:48 np0005539510 setsebool[8174]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 00:47:59 np0005539510 kernel: SELinux:  Converting 388 SID table entries...
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:47:59 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:48:18 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:48:18 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 00:48:18 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 00:48:18 np0005539510 systemd[1]: Reloading.
Nov 29 00:48:18 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:48:18 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 00:48:24 np0005539510 python3[13473]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-4d52-d96a-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:48:25 np0005539510 kernel: evm: overlay not supported
Nov 29 00:48:25 np0005539510 systemd[4302]: Starting D-Bus User Message Bus...
Nov 29 00:48:25 np0005539510 dbus-broker-launch[14032]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 00:48:25 np0005539510 dbus-broker-launch[14032]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 00:48:25 np0005539510 systemd[4302]: Started D-Bus User Message Bus.
Nov 29 00:48:25 np0005539510 dbus-broker-lau[14032]: Ready
Nov 29 00:48:25 np0005539510 systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:48:25 np0005539510 systemd[4302]: Created slice Slice /user.
Nov 29 00:48:25 np0005539510 systemd[4302]: podman-13957.scope: unit configures an IP firewall, but not running as root.
Nov 29 00:48:25 np0005539510 systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 00:48:25 np0005539510 systemd[4302]: Started podman-13957.scope.
Nov 29 00:48:25 np0005539510 systemd[4302]: Started podman-pause-164fd77e.scope.
Nov 29 00:48:26 np0005539510 python3[14237]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:26 np0005539510 python3[14237]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 00:48:26 np0005539510 systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 00:48:26 np0005539510 systemd[1]: session-6.scope: Consumed 57.640s CPU time.
Nov 29 00:48:26 np0005539510 systemd-logind[784]: Session 6 logged out. Waiting for processes to exit.
Nov 29 00:48:26 np0005539510 systemd-logind[784]: Removed session 6.
Nov 29 00:48:52 np0005539510 systemd-logind[784]: New session 7 of user zuul.
Nov 29 00:48:52 np0005539510 systemd[1]: Started Session 7 of User zuul.
Nov 29 00:48:52 np0005539510 python3[23690]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:48:53 np0005539510 python3[23894]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:48:54 np0005539510 python3[24287]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539510.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 00:48:54 np0005539510 python3[24538]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEavs4NswnbtUkOvkddxZOa3c0S0nRNnsg86RQqSndpHonQx0HDlahei607KJa9VEo3VyPPhB6+AdHzrVqMc6KA= zuul@np0005539507.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:48:55 np0005539510 python3[24799]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:48:55 np0005539510 python3[25068]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395335.0963047-170-205090840395830/source _original_basename=tmpd5cvm_7s follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:56 np0005539510 python3[25387]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Nov 29 00:48:56 np0005539510 systemd[1]: Starting Hostname Service...
Nov 29 00:48:56 np0005539510 systemd[1]: Started Hostname Service.
Nov 29 00:48:56 np0005539510 systemd-hostnamed[25496]: Changed pretty hostname to 'compute-2'
Nov 29 00:48:57 np0005539510 systemd-hostnamed[25496]: Hostname set to <compute-2> (static)
Nov 29 00:48:57 np0005539510 NetworkManager[7196]: <info>  [1764395337.0010] hostname: static hostname changed from "np0005539510.novalocal" to "compute-2"
Nov 29 00:48:57 np0005539510 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:48:57 np0005539510 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:48:57 np0005539510 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 00:48:57 np0005539510 systemd[1]: session-7.scope: Consumed 2.360s CPU time.
Nov 29 00:48:57 np0005539510 systemd-logind[784]: Session 7 logged out. Waiting for processes to exit.
Nov 29 00:48:57 np0005539510 systemd-logind[784]: Removed session 7.
Nov 29 00:49:07 np0005539510 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:49:09 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 00:49:09 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 00:49:09 np0005539510 systemd[1]: man-db-cache-update.service: Consumed 1min 3.667s CPU time.
Nov 29 00:49:09 np0005539510 systemd[1]: run-r74f79890b8ab4be6abf78398fd034a1b.service: Deactivated successfully.
Nov 29 00:49:27 np0005539510 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:50:39 np0005539510 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 00:50:39 np0005539510 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 00:50:39 np0005539510 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 00:50:39 np0005539510 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 00:53:01 np0005539510 systemd-logind[784]: New session 8 of user zuul.
Nov 29 00:53:01 np0005539510 systemd[1]: Started Session 8 of User zuul.
Nov 29 00:53:02 np0005539510 python3[30015]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:53:04 np0005539510 python3[30131]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:04 np0005539510 python3[30204]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:05 np0005539510 python3[30230]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:05 np0005539510 python3[30303]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:05 np0005539510 python3[30329]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:06 np0005539510 python3[30402]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:06 np0005539510 python3[30428]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:06 np0005539510 python3[30501]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:07 np0005539510 python3[30527]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:07 np0005539510 python3[30600]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:07 np0005539510 python3[30626]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:08 np0005539510 python3[30699]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:08 np0005539510 python3[30725]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:53:08 np0005539510 python3[30798]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764395583.910924-34048-275634009079375/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:53:20 np0005539510 python3[30846]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:58:21 np0005539510 systemd-logind[784]: Session 8 logged out. Waiting for processes to exit.
Nov 29 00:58:21 np0005539510 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 00:58:21 np0005539510 systemd[1]: session-8.scope: Consumed 5.607s CPU time.
Nov 29 00:58:21 np0005539510 systemd-logind[784]: Removed session 8.
Nov 29 01:06:27 np0005539510 systemd-logind[784]: New session 9 of user zuul.
Nov 29 01:06:27 np0005539510 systemd[1]: Started Session 9 of User zuul.
Nov 29 01:06:28 np0005539510 python3.9[31035]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:06:29 np0005539510 python3.9[31216]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:06:38 np0005539510 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 01:06:38 np0005539510 systemd[1]: session-9.scope: Consumed 7.719s CPU time.
Nov 29 01:06:38 np0005539510 systemd-logind[784]: Session 9 logged out. Waiting for processes to exit.
Nov 29 01:06:38 np0005539510 systemd-logind[784]: Removed session 9.
Nov 29 01:06:54 np0005539510 systemd-logind[784]: New session 10 of user zuul.
Nov 29 01:06:54 np0005539510 systemd[1]: Started Session 10 of User zuul.
Nov 29 01:06:55 np0005539510 python3.9[31432]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 01:06:56 np0005539510 python3.9[31606]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:06:57 np0005539510 python3.9[31758]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:06:59 np0005539510 python3.9[31911]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:07:00 np0005539510 python3.9[32063]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:07:01 np0005539510 python3.9[32215]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:07:02 np0005539510 python3.9[32338]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396420.8783379-184-190824286794721/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:07:03 np0005539510 python3.9[32490]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:07:04 np0005539510 python3.9[32646]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:07:05 np0005539510 python3.9[32798]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:07:05 np0005539510 python3.9[32948]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:07:09 np0005539510 python3.9[33201]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:07:10 np0005539510 python3.9[33351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:07:12 np0005539510 python3.9[33505]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:07:13 np0005539510 python3.9[33663]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:07:14 np0005539510 python3.9[33747]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:07:56 np0005539510 systemd[1]: Reloading.
Nov 29 01:07:56 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:07:56 np0005539510 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 01:07:56 np0005539510 systemd[1]: Reloading.
Nov 29 01:07:56 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:07:57 np0005539510 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 01:07:57 np0005539510 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 01:07:57 np0005539510 systemd[1]: Reloading.
Nov 29 01:07:57 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:07:57 np0005539510 systemd[1]: Starting dnf makecache...
Nov 29 01:07:57 np0005539510 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 01:07:57 np0005539510 dnf[34032]: Failed determining last makecache time.
Nov 29 01:07:57 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:07:57 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-barbican-42b4c41831408a8e323 155 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 188 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-cinder-1c00d6490d88e436f26ef 190 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-stevedore-c4acc5639fd2329372142 190 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-cloudkitty-tests-tempest-2c80f8 197 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-os-net-config-9758ab42364673d01bc5014e 193 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 207 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-designate-tests-tempest-347fdbc 208 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-glance-1fd12c29b339f30fe823e 189 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 185 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-manila-3c01b7181572c95dac462 186 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-whitebox-neutron-tests-tempest- 191 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-octavia-ba397f07a7331190208c 183 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-watcher-c014f81a8647287f6dcc 194 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-tcib-1124124ec06aadbac34f0d340b 175 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 161 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-swift-dc98a8463506ac520c469a 161 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-python-tempestconf-8515371b7cceebd4282 173 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: delorean-openstack-heat-ui-013accbfd179753bc3f0 181 kB/s | 3.0 kB     00:00
Nov 29 01:07:57 np0005539510 dnf[34032]: CentOS Stream 9 - BaseOS                         79 kB/s | 7.3 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: CentOS Stream 9 - AppStream                      32 kB/s | 7.4 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: CentOS Stream 9 - CRB                            31 kB/s | 7.2 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: CentOS Stream 9 - Extras packages                74 kB/s | 8.3 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: dlrn-antelope-testing                           138 kB/s | 3.0 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: dlrn-antelope-build-deps                        135 kB/s | 3.0 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: centos9-rabbitmq                                107 kB/s | 3.0 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: centos9-storage                                 116 kB/s | 3.0 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: centos9-opstools                                142 kB/s | 3.0 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: NFV SIG OpenvSwitch                             150 kB/s | 3.0 kB     00:00
Nov 29 01:07:58 np0005539510 dnf[34032]: repo-setup-centos-appstream                     192 kB/s | 4.4 kB     00:00
Nov 29 01:07:59 np0005539510 dnf[34032]: repo-setup-centos-baseos                         97 kB/s | 3.9 kB     00:00
Nov 29 01:07:59 np0005539510 dnf[34032]: repo-setup-centos-highavailability              183 kB/s | 3.9 kB     00:00
Nov 29 01:07:59 np0005539510 dnf[34032]: repo-setup-centos-powertools                    193 kB/s | 4.3 kB     00:00
Nov 29 01:07:59 np0005539510 dnf[34032]: Extra Packages for Enterprise Linux 9 - x86_64  109 kB/s |  33 kB     00:00
Nov 29 01:08:00 np0005539510 dnf[34032]: Metadata cache created.
Nov 29 01:08:00 np0005539510 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 01:08:00 np0005539510 systemd[1]: Finished dnf makecache.
Nov 29 01:08:00 np0005539510 systemd[1]: dnf-makecache.service: Consumed 1.854s CPU time.
Nov 29 01:09:06 np0005539510 kernel: SELinux:  Converting 2718 SID table entries...
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:09:06 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:09:07 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 01:09:07 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:09:07 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:09:07 np0005539510 systemd[1]: Reloading.
Nov 29 01:09:08 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:09:08 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:09:11 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:09:11 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:09:11 np0005539510 systemd[1]: man-db-cache-update.service: Consumed 1.506s CPU time.
Nov 29 01:09:11 np0005539510 systemd[1]: run-re5c9407fe725444398094d54ef3c8658.service: Deactivated successfully.
Nov 29 01:09:18 np0005539510 python3.9[35343]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:09:21 np0005539510 python3.9[35624]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 01:09:22 np0005539510 python3.9[35776]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 01:09:25 np0005539510 python3.9[35929]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:09:27 np0005539510 python3.9[36081]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 01:09:32 np0005539510 python3.9[36233]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:35 np0005539510 python3.9[36385]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:09:35 np0005539510 python3.9[36508]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396572.4166644-673-168308001233907/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:09:38 np0005539510 python3.9[36660]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:09:39 np0005539510 python3.9[36812]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:09:40 np0005539510 python3.9[36965]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:09:41 np0005539510 python3.9[37117]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 01:09:41 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:09:43 np0005539510 python3.9[37271]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:09:44 np0005539510 python3.9[37429]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:09:45 np0005539510 python3.9[37589]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 01:09:46 np0005539510 python3.9[37742]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:09:47 np0005539510 python3.9[37900]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 01:09:48 np0005539510 python3.9[38052]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:09:51 np0005539510 python3.9[38205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:52 np0005539510 python3.9[38357]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:09:52 np0005539510 python3.9[38480]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396591.5275009-1031-149692483657493/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:53 np0005539510 python3.9[38632]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:09:53 np0005539510 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:09:53 np0005539510 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 01:09:53 np0005539510 kernel: Bridge firewalling registered
Nov 29 01:09:53 np0005539510 systemd-modules-load[38636]: Inserted module 'br_netfilter'
Nov 29 01:09:53 np0005539510 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:09:55 np0005539510 python3.9[38792]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:09:55 np0005539510 python3.9[38915]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396594.6192741-1100-201496844374221/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:09:56 np0005539510 python3.9[39067]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:10:00 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:10:00 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:10:01 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:10:01 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:10:01 np0005539510 systemd[1]: Reloading.
Nov 29 01:10:01 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:01 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:10:04 np0005539510 python3.9[42269]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:10:05 np0005539510 python3.9[42956]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 01:10:05 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:10:05 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:10:05 np0005539510 systemd[1]: man-db-cache-update.service: Consumed 5.617s CPU time.
Nov 29 01:10:05 np0005539510 systemd[1]: run-r02a5de2f542640ad8d211521bd77735b.service: Deactivated successfully.
Nov 29 01:10:06 np0005539510 python3.9[43107]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:10:07 np0005539510 python3.9[43259]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:07 np0005539510 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:10:08 np0005539510 systemd[1]: Starting Authorization Manager...
Nov 29 01:10:08 np0005539510 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:10:08 np0005539510 polkitd[43476]: Started polkitd version 0.117
Nov 29 01:10:08 np0005539510 systemd[1]: Started Authorization Manager.
Nov 29 01:10:09 np0005539510 python3.9[43646]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:10:09 np0005539510 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 01:10:09 np0005539510 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 01:10:09 np0005539510 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 01:10:09 np0005539510 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:10:09 np0005539510 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:10:10 np0005539510 python3.9[43807]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 01:10:14 np0005539510 python3.9[43959]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:10:14 np0005539510 systemd[1]: Reloading.
Nov 29 01:10:14 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:15 np0005539510 python3.9[44147]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:10:15 np0005539510 systemd[1]: Reloading.
Nov 29 01:10:15 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:17 np0005539510 python3.9[44335]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:17 np0005539510 python3.9[44488]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:17 np0005539510 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 01:10:18 np0005539510 python3.9[44641]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:21 np0005539510 python3.9[44803]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:10:22 np0005539510 python3.9[44956]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:10:22 np0005539510 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 01:10:22 np0005539510 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 01:10:22 np0005539510 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 01:10:22 np0005539510 systemd[1]: Starting Apply Kernel Variables...
Nov 29 01:10:22 np0005539510 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 01:10:22 np0005539510 systemd[1]: Finished Apply Kernel Variables.
Nov 29 01:10:22 np0005539510 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 01:10:22 np0005539510 systemd[1]: session-10.scope: Consumed 2min 17.185s CPU time.
Nov 29 01:10:22 np0005539510 systemd-logind[784]: Session 10 logged out. Waiting for processes to exit.
Nov 29 01:10:22 np0005539510 systemd-logind[784]: Removed session 10.
Nov 29 01:10:28 np0005539510 systemd-logind[784]: New session 11 of user zuul.
Nov 29 01:10:29 np0005539510 systemd[1]: Started Session 11 of User zuul.
Nov 29 01:10:30 np0005539510 python3.9[45139]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:10:31 np0005539510 python3.9[45295]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 01:10:32 np0005539510 python3.9[45448]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:10:34 np0005539510 python3.9[45606]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:10:35 np0005539510 python3.9[45766]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:10:36 np0005539510 python3.9[45850]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:10:40 np0005539510 python3.9[46015]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:10:51 np0005539510 kernel: SELinux:  Converting 2730 SID table entries...
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:10:51 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:10:52 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 01:10:52 np0005539510 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 01:10:53 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:10:53 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:10:53 np0005539510 systemd[1]: Reloading.
Nov 29 01:10:53 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:10:53 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:53 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:10:54 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:10:54 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:10:54 np0005539510 systemd[1]: run-ra3d75904055f4de0b1e2a4218b562bf6.service: Deactivated successfully.
Nov 29 01:10:59 np0005539510 python3.9[47112]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:10:59 np0005539510 systemd[1]: Reloading.
Nov 29 01:10:59 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:10:59 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:10:59 np0005539510 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 01:10:59 np0005539510 chown[47155]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 01:10:59 np0005539510 ovs-ctl[47160]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 01:10:59 np0005539510 ovs-ctl[47160]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 01:10:59 np0005539510 ovs-ctl[47160]: Starting ovsdb-server [  OK  ]
Nov 29 01:10:59 np0005539510 ovs-vsctl[47209]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 01:10:59 np0005539510 ovs-vsctl[47229]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"fa6f2e5a-176a-4b37-8b2a-5aaf74119c47\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 01:10:59 np0005539510 ovs-ctl[47160]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 01:10:59 np0005539510 ovs-ctl[47160]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:10:59 np0005539510 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 01:10:59 np0005539510 ovs-vsctl[47235]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 29 01:11:00 np0005539510 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 01:11:00 np0005539510 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 01:11:00 np0005539510 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 01:11:00 np0005539510 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 01:11:00 np0005539510 ovs-ctl[47279]: Inserting openvswitch module [  OK  ]
Nov 29 01:11:00 np0005539510 ovs-ctl[47248]: Starting ovs-vswitchd [  OK  ]
Nov 29 01:11:00 np0005539510 ovs-vsctl[47296]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 29 01:11:00 np0005539510 ovs-ctl[47248]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:11:00 np0005539510 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 01:11:00 np0005539510 systemd[1]: Starting Open vSwitch...
Nov 29 01:11:00 np0005539510 systemd[1]: Finished Open vSwitch.
Nov 29 01:11:01 np0005539510 python3.9[47448]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:11:02 np0005539510 python3.9[47600]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 01:11:03 np0005539510 kernel: SELinux:  Converting 2744 SID table entries...
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:11:03 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:11:04 np0005539510 python3.9[47755]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:11:05 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 01:11:06 np0005539510 python3.9[47913]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:11:08 np0005539510 python3.9[48066]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:11:10 np0005539510 python3.9[48353]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:11:11 np0005539510 python3.9[48503]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:11:12 np0005539510 python3.9[48657]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:11:14 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:11:14 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:11:14 np0005539510 systemd[1]: Reloading.
Nov 29 01:11:14 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:11:14 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:11:14 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:11:14 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:11:14 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:11:14 np0005539510 systemd[1]: run-r7a764ade846c4b7b8f2e22f306461885.service: Deactivated successfully.
Nov 29 01:11:15 np0005539510 python3.9[48974]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:11:16 np0005539510 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 01:11:16 np0005539510 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 01:11:16 np0005539510 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 01:11:16 np0005539510 systemd[1]: Stopping Network Manager...
Nov 29 01:11:16 np0005539510 NetworkManager[7196]: <info>  [1764396676.7333] caught SIGTERM, shutting down normally.
Nov 29 01:11:16 np0005539510 NetworkManager[7196]: <info>  [1764396676.7359] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:11:16 np0005539510 NetworkManager[7196]: <info>  [1764396676.7361] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:11:16 np0005539510 NetworkManager[7196]: <info>  [1764396676.7362] dhcp4 (eth0): state changed no lease
Nov 29 01:11:16 np0005539510 NetworkManager[7196]: <info>  [1764396676.7366] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:11:16 np0005539510 NetworkManager[7196]: <info>  [1764396676.7571] exiting (success)
Nov 29 01:11:16 np0005539510 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:11:16 np0005539510 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:11:16 np0005539510 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 01:11:16 np0005539510 systemd[1]: Stopped Network Manager.
Nov 29 01:11:16 np0005539510 systemd[1]: NetworkManager.service: Consumed 12.300s CPU time, 4.1M memory peak, read 0B from disk, written 38.0K to disk.
Nov 29 01:11:16 np0005539510 systemd[1]: Starting Network Manager...
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.8199] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:631d2949-c1d4-4f67-afc4-db082a3ff43a)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.8200] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.8252] manager[0x55a932344090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 01:11:16 np0005539510 systemd[1]: Starting Hostname Service...
Nov 29 01:11:16 np0005539510 systemd[1]: Started Hostname Service.
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9340] hostname: hostname: using hostnamed
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9342] hostname: static hostname changed from (none) to "compute-2"
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9346] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9350] manager[0x55a932344090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9350] manager[0x55a932344090]: rfkill: WWAN hardware radio set enabled
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9370] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9377] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9378] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9378] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9379] manager: Networking is enabled by state file
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9380] settings: Loaded settings plugin: keyfile (internal)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9383] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9401] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9407] dhcp: init: Using DHCP client 'internal'
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9410] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9413] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9417] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9422] device (lo): Activation: starting connection 'lo' (5b6e73d6-4c36-495c-9d49-56d866cbd8e2)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9426] device (eth0): carrier: link connected
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9429] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9432] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9433] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9436] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9441] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9444] device (eth1): carrier: link connected
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9447] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9450] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc) (indicated)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9451] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9454] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9459] device (eth1): Activation: starting connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9464] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9469] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9470] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9471] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539510 systemd[1]: Started Network Manager.
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9473] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9474] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9476] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9477] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9479] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9484] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9486] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9507] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9530] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9545] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9550] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9554] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9564] device (lo): Activation: successful, device activated.
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9581] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 01:11:16 np0005539510 systemd[1]: Starting Network Manager Wait Online...
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9662] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9674] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9677] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9684] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9689] device (eth1): Activation: successful, device activated.
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9704] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9707] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9713] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9719] device (eth0): Activation: successful, device activated.
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9726] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 01:11:16 np0005539510 NetworkManager[48989]: <info>  [1764396676.9732] manager: startup complete
Nov 29 01:11:16 np0005539510 systemd[1]: Finished Network Manager Wait Online.
Nov 29 01:11:17 np0005539510 python3.9[49203]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:11:26 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:11:26 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:11:26 np0005539510 systemd[1]: Reloading.
Nov 29 01:11:26 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:11:26 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:11:26 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:11:27 np0005539510 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:11:29 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:11:29 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:11:29 np0005539510 systemd[1]: run-rcc5793aeba7e41e3bbf790d3a6c3aec5.service: Deactivated successfully.
Nov 29 01:11:30 np0005539510 python3.9[49662]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:11:31 np0005539510 python3.9[49814]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:32 np0005539510 python3.9[49968]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:32 np0005539510 python3.9[50120]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:33 np0005539510 python3.9[50272]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:34 np0005539510 python3.9[50424]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:35 np0005539510 python3.9[50576]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:11:36 np0005539510 python3.9[50699]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396694.7573783-654-138020752444616/.source _original_basename=.6f2xn2e9 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:36 np0005539510 python3.9[50851]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:37 np0005539510 python3.9[51003]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 01:11:38 np0005539510 python3.9[51155]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:41 np0005539510 python3.9[51582]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 01:11:42 np0005539510 ansible-async_wrapper.py[51757]: Invoked with j690250472157 300 /home/zuul/.ansible/tmp/ansible-tmp-1764396701.787739-852-56175461338808/AnsiballZ_edpm_os_net_config.py _
Nov 29 01:11:42 np0005539510 ansible-async_wrapper.py[51760]: Starting module and watcher
Nov 29 01:11:42 np0005539510 ansible-async_wrapper.py[51760]: Start watching 51761 (300)
Nov 29 01:11:42 np0005539510 ansible-async_wrapper.py[51761]: Start module (51761)
Nov 29 01:11:42 np0005539510 ansible-async_wrapper.py[51757]: Return async_wrapper task started.
Nov 29 01:11:42 np0005539510 python3.9[51762]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 01:11:43 np0005539510 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 01:11:43 np0005539510 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 01:11:43 np0005539510 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 01:11:43 np0005539510 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 01:11:43 np0005539510 kernel: cfg80211: failed to load regulatory.db
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.0638] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.0666] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1482] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1489] audit: op="connection-add" uuid="ebfe54cf-96d3-49e4-b61e-3677a9d0560c" name="br-ex-br" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1501] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1503] audit: op="connection-add" uuid="017be005-bfda-476b-a9ba-d18ac711f909" name="br-ex-port" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1515] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1518] audit: op="connection-add" uuid="1085709c-fa82-42c1-b9a3-9116d2c9c85c" name="eth1-port" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1530] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1532] audit: op="connection-add" uuid="56a3b9e0-a8cf-4243-8dbe-779deca6da4e" name="vlan20-port" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1544] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1546] audit: op="connection-add" uuid="f84d3f1b-fa78-4ec7-a5e6-a9c478291891" name="vlan21-port" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1557] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1560] audit: op="connection-add" uuid="34370b0b-4ab8-4813-96eb-6607120b8615" name="vlan22-port" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1572] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1574] audit: op="connection-add" uuid="8bffdbc6-597a-42b5-85f1-65525bb0f7fd" name="vlan23-port" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1593] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1609] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1612] audit: op="connection-add" uuid="f1807500-9cc7-403f-ba10-db9bacdae9f1" name="br-ex-if" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1661] audit: op="connection-update" uuid="00e95469-28f7-5d90-a077-7f69916381bc" name="ci-private-network" args="ovs-interface.type,ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.method,ipv4.routes,ipv4.routing-rules,connection.controller,connection.port-type,connection.slave-type,connection.timestamp,connection.master,ipv6.dns,ipv6.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.routing-rules,ovs-external-ids.data" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1676] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1679] audit: op="connection-add" uuid="2453c22d-6f61-42e5-85d9-d536640dda9d" name="vlan20-if" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1693] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1696] audit: op="connection-add" uuid="2a7b0e3f-5d6a-4918-ac4e-9d3dc2d30b6e" name="vlan21-if" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1712] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1715] audit: op="connection-add" uuid="63311777-9903-4357-845b-90dd8bfaf872" name="vlan22-if" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1731] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1733] audit: op="connection-add" uuid="4526e221-80ce-4766-b424-c6e5fc20bae4" name="vlan23-if" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1746] audit: op="connection-delete" uuid="30147632-9597-375e-a51b-e6c74b52332e" name="Wired connection 1" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1758] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1771] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1777] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ebfe54cf-96d3-49e4-b61e-3677a9d0560c)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1779] audit: op="connection-activate" uuid="ebfe54cf-96d3-49e4-b61e-3677a9d0560c" name="br-ex-br" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1782] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1792] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1797] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (017be005-bfda-476b-a9ba-d18ac711f909)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1800] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1809] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1815] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1085709c-fa82-42c1-b9a3-9116d2c9c85c)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1817] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1827] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1832] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (56a3b9e0-a8cf-4243-8dbe-779deca6da4e)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1835] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1844] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1850] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (f84d3f1b-fa78-4ec7-a5e6-a9c478291891)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1853] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1862] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1868] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (34370b0b-4ab8-4813-96eb-6607120b8615)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1870] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1880] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1886] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (8bffdbc6-597a-42b5-85f1-65525bb0f7fd)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1888] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1892] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1895] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1902] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1909] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1915] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (f1807500-9cc7-403f-ba10-db9bacdae9f1)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1917] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1922] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1925] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1926] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1929] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1943] device (eth1): disconnecting for new activation request.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1944] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1947] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1949] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1950] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1952] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1956] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1960] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (2453c22d-6f61-42e5-85d9-d536640dda9d)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1961] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1963] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1965] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1966] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1969] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1973] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1977] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (2a7b0e3f-5d6a-4918-ac4e-9d3dc2d30b6e)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1978] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1980] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1982] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1983] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1986] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1990] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1994] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (63311777-9903-4357-845b-90dd8bfaf872)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1995] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1997] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.1999] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2000] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2003] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2007] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2011] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (4526e221-80ce-4766-b424-c6e5fc20bae4)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2012] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2015] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2016] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2017] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2019] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2030] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2032] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2035] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2036] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2043] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2045] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2049] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2051] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2052] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2056] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2059] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2061] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2062] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2065] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2068] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2070] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2072] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2075] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2078] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2080] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2082] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2085] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2088] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2088] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2089] dhcp4 (eth0): state changed no lease
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2090] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2100] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51763 uid=0 result="fail" reason="Device is not activated"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2137] dhcp4 (eth0): state changed new lease, address=38.102.83.94
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2227] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2236] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 kernel: ovs-system: entered promiscuous mode
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2267] device (eth1): disconnecting for new activation request.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2268] audit: op="connection-activate" uuid="00e95469-28f7-5d90-a077-7f69916381bc" name="ci-private-network" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2271] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2281] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 kernel: Timeout policy base is empty
Nov 29 01:11:45 np0005539510 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2291] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 systemd-udevd[51768]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2335] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 01:11:45 np0005539510 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2415] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2510] device (eth1): Activation: starting connection 'ci-private-network' (00e95469-28f7-5d90-a077-7f69916381bc)
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2515] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2525] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2531] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2538] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2543] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2548] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2550] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2552] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2553] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2555] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2557] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2561] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2568] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2573] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2576] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2582] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2586] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2591] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2595] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2600] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2604] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2609] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2613] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2617] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2623] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2627] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2674] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2676] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2683] device (eth1): Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 kernel: br-ex: entered promiscuous mode
Nov 29 01:11:45 np0005539510 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 01:11:45 np0005539510 kernel: vlan22: entered promiscuous mode
Nov 29 01:11:45 np0005539510 systemd-udevd[51769]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2875] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2887] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2914] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2916] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.2924] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 kernel: vlan21: entered promiscuous mode
Nov 29 01:11:45 np0005539510 kernel: vlan23: entered promiscuous mode
Nov 29 01:11:45 np0005539510 systemd-udevd[51767]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3052] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3064] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3078] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3091] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3095] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3107] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 kernel: vlan20: entered promiscuous mode
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3115] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3162] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3166] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3174] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3217] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3234] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3247] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3261] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3270] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3275] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3284] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3320] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3322] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:11:45 np0005539510 NetworkManager[48989]: <info>  [1764396705.3330] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:11:46 np0005539510 NetworkManager[48989]: <info>  [1764396706.4565] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 01:11:46 np0005539510 python3.9[52121]: ansible-ansible.legacy.async_status Invoked with jid=j690250472157.51757 mode=status _async_dir=/root/.ansible_async
Nov 29 01:11:46 np0005539510 NetworkManager[48989]: <info>  [1764396706.6757] checkpoint[0x55a93231a950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 01:11:46 np0005539510 NetworkManager[48989]: <info>  [1764396706.6759] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51763 uid=0 result="success"
Nov 29 01:11:46 np0005539510 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.0092] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.0109] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.2570] audit: op="networking-control" arg="global-dns-configuration" pid=51763 uid=0 result="success"
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.2599] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.2637] audit: op="networking-control" arg="global-dns-configuration" pid=51763 uid=0 result="success"
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.2665] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.4824] checkpoint[0x55a93231aa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 01:11:47 np0005539510 NetworkManager[48989]: <info>  [1764396707.4829] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51763 uid=0 result="success"
Nov 29 01:11:47 np0005539510 ansible-async_wrapper.py[51761]: Module complete (51761)
Nov 29 01:11:47 np0005539510 ansible-async_wrapper.py[51760]: Done in kid B.
Nov 29 01:11:50 np0005539510 python3.9[52229]: ansible-ansible.legacy.async_status Invoked with jid=j690250472157.51757 mode=status _async_dir=/root/.ansible_async
Nov 29 01:11:50 np0005539510 python3.9[52329]: ansible-ansible.legacy.async_status Invoked with jid=j690250472157.51757 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 01:11:51 np0005539510 python3.9[52481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:11:52 np0005539510 python3.9[52604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396711.0874846-933-126930131588677/.source.returncode _original_basename=.1wwpghqs follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:53 np0005539510 python3.9[52756]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:11:54 np0005539510 python3.9[52880]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396712.5868278-981-16913086037819/.source.cfg _original_basename=.7zg22jhc follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:11:54 np0005539510 python3.9[53032]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:11:55 np0005539510 systemd[1]: Reloading Network Manager...
Nov 29 01:11:55 np0005539510 NetworkManager[48989]: <info>  [1764396715.0530] audit: op="reload" arg="0" pid=53036 uid=0 result="success"
Nov 29 01:11:55 np0005539510 NetworkManager[48989]: <info>  [1764396715.0537] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 01:11:55 np0005539510 systemd[1]: Reloaded Network Manager.
Nov 29 01:11:55 np0005539510 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 01:11:55 np0005539510 systemd[1]: session-11.scope: Consumed 51.308s CPU time.
Nov 29 01:11:55 np0005539510 systemd-logind[784]: Session 11 logged out. Waiting for processes to exit.
Nov 29 01:11:55 np0005539510 systemd-logind[784]: Removed session 11.
Nov 29 01:12:00 np0005539510 systemd-logind[784]: New session 12 of user zuul.
Nov 29 01:12:00 np0005539510 systemd[1]: Started Session 12 of User zuul.
Nov 29 01:12:01 np0005539510 python3.9[53220]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:02 np0005539510 python3.9[53374]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:05 np0005539510 python3.9[53568]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:12:05 np0005539510 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:12:06 np0005539510 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 01:12:06 np0005539510 systemd[1]: session-12.scope: Consumed 2.442s CPU time.
Nov 29 01:12:06 np0005539510 systemd-logind[784]: Session 12 logged out. Waiting for processes to exit.
Nov 29 01:12:06 np0005539510 systemd-logind[784]: Removed session 12.
Nov 29 01:12:11 np0005539510 systemd-logind[784]: New session 13 of user zuul.
Nov 29 01:12:11 np0005539510 systemd[1]: Started Session 13 of User zuul.
Nov 29 01:12:13 np0005539510 python3.9[53751]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:14 np0005539510 python3.9[53905]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:15 np0005539510 python3.9[54061]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:16 np0005539510 python3.9[54146]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:12:18 np0005539510 python3.9[54299]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:21 np0005539510 python3.9[54494]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:22 np0005539510 python3.9[54646]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:12:23 np0005539510 podman[54647]: 2025-11-29 06:12:23.897961758 +0000 UTC m=+1.735448650 system refresh
Nov 29 01:12:24 np0005539510 python3.9[54809]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:24 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:12:25 np0005539510 python3.9[54932]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396744.1340225-204-16835673226794/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9d8d6fb48c24217b2ff710035753b7137f3c873e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:26 np0005539510 python3.9[55084]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:27 np0005539510 python3.9[55207]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396745.769978-250-154268522424042/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:28 np0005539510 python3.9[55359]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:28 np0005539510 python3.9[55511]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:29 np0005539510 python3.9[55663]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:30 np0005539510 python3.9[55815]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:12:31 np0005539510 python3.9[55967]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:12:33 np0005539510 python3.9[56120]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:12:34 np0005539510 python3.9[56274]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:12:35 np0005539510 python3.9[56426]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:12:36 np0005539510 python3.9[56578]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:12:38 np0005539510 python3.9[56731]: ansible-service_facts Invoked
Nov 29 01:12:38 np0005539510 network[56748]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:12:38 np0005539510 network[56749]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:12:38 np0005539510 network[56750]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:12:44 np0005539510 python3.9[57202]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:12:48 np0005539510 python3.9[57355]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 01:12:50 np0005539510 python3.9[57507]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:51 np0005539510 python3.9[57632]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396769.836312-683-93344662891375/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:52 np0005539510 python3.9[57786]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:12:52 np0005539510 python3.9[57911]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396771.4855196-728-266827799762230/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:54 np0005539510 python3.9[58065]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:12:56 np0005539510 python3.9[58219]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:57 np0005539510 python3.9[58303]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:12:59 np0005539510 python3.9[58457]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:12:59 np0005539510 python3.9[58541]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:12:59 np0005539510 chronyd[787]: chronyd exiting
Nov 29 01:12:59 np0005539510 systemd[1]: Stopping NTP client/server...
Nov 29 01:12:59 np0005539510 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 01:12:59 np0005539510 systemd[1]: Stopped NTP client/server.
Nov 29 01:12:59 np0005539510 systemd[1]: Starting NTP client/server...
Nov 29 01:12:59 np0005539510 chronyd[58550]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 01:12:59 np0005539510 chronyd[58550]: Frequency -26.017 +/- 0.314 ppm read from /var/lib/chrony/drift
Nov 29 01:12:59 np0005539510 chronyd[58550]: Loaded seccomp filter (level 2)
Nov 29 01:12:59 np0005539510 systemd[1]: Started NTP client/server.
Nov 29 01:13:00 np0005539510 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 01:13:00 np0005539510 systemd[1]: session-13.scope: Consumed 28.003s CPU time.
Nov 29 01:13:00 np0005539510 systemd-logind[784]: Session 13 logged out. Waiting for processes to exit.
Nov 29 01:13:00 np0005539510 systemd-logind[784]: Removed session 13.
Nov 29 01:13:06 np0005539510 systemd-logind[784]: New session 14 of user zuul.
Nov 29 01:13:06 np0005539510 systemd[1]: Started Session 14 of User zuul.
Nov 29 01:13:07 np0005539510 python3.9[58731]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:07 np0005539510 python3.9[58883]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:08 np0005539510 python3.9[59006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396787.308909-71-207119636941577/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:09 np0005539510 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 01:13:09 np0005539510 systemd[1]: session-14.scope: Consumed 1.590s CPU time.
Nov 29 01:13:09 np0005539510 systemd-logind[784]: Session 14 logged out. Waiting for processes to exit.
Nov 29 01:13:09 np0005539510 systemd-logind[784]: Removed session 14.
Nov 29 01:13:14 np0005539510 systemd-logind[784]: New session 15 of user zuul.
Nov 29 01:13:14 np0005539510 systemd[1]: Started Session 15 of User zuul.
Nov 29 01:13:15 np0005539510 python3.9[59184]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:13:16 np0005539510 python3.9[59340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:17 np0005539510 python3.9[59515]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:18 np0005539510 python3.9[59638]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764396797.140119-89-253032209619510/.source.json _original_basename=.ibopix0k follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:19 np0005539510 python3.9[59790]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:20 np0005539510 python3.9[59913]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396799.2680554-158-263538102062156/.source _original_basename=.xppcstfp follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:21 np0005539510 python3.9[60065]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:13:22 np0005539510 python3.9[60217]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:23 np0005539510 python3.9[60340]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396801.9480999-231-47786330041578/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:13:23 np0005539510 python3.9[60493]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:24 np0005539510 python3.9[60618]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764396803.2062855-231-225834578073087/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:13:25 np0005539510 python3.9[60770]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:26 np0005539510 python3.9[60922]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:26 np0005539510 python3.9[61045]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396805.4738147-342-1478881823389/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:27 np0005539510 python3.9[61197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:28 np0005539510 python3.9[61320]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396806.9685643-386-201103530186761/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:29 np0005539510 python3.9[61472]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:29 np0005539510 systemd[1]: Reloading.
Nov 29 01:13:29 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:29 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:29 np0005539510 systemd[1]: Reloading.
Nov 29 01:13:29 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:29 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:30 np0005539510 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 01:13:30 np0005539510 systemd[1]: Finished EDPM Container Shutdown.
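The three tasks above install a systemd preset file and then enable and start the unit it covers. The log records only the preset file's checksum, not its content; a 91-*.preset file used this way typically contains a single directive (unit name inferred from the task's `name=` parameter):

```
enable edpm-container-shutdown.service
```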
Nov 29 01:13:31 np0005539510 python3.9[61699]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:32 np0005539510 python3.9[61822]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396810.8107858-456-252325729249471/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:32 np0005539510 python3.9[61974]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:33 np0005539510 python3.9[62097]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396812.2890506-501-46785231949354/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:34 np0005539510 python3.9[62249]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:34 np0005539510 systemd[1]: Reloading.
Nov 29 01:13:34 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:34 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:34 np0005539510 systemd[1]: Reloading.
Nov 29 01:13:34 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:34 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:34 np0005539510 systemd[1]: Starting Create netns directory...
Nov 29 01:13:34 np0005539510 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:13:34 np0005539510 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:13:34 np0005539510 systemd[1]: Finished Create netns directory.
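The netns-placeholder unit runs once ("Starting Create netns directory..."), after which systemd reports both the service and a transient `run-netns-placeholder.mount` deactivated. The actual unit content is not in the log (only its checksum at 01:13:32); a plausible oneshot sketch consistent with this behavior, with the ExecStart command assumed:

```ini
[Unit]
Description=Create netns directory

[Service]
Type=oneshot
# assumed: pre-create /run/netns so container network namespaces have a mount point
ExecStart=/bin/sh -c 'mkdir -p /run/netns && mount --bind /run/netns /run/netns'

[Install]
WantedBy=multi-user.target
```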
Nov 29 01:13:36 np0005539510 python3.9[62476]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:13:36 np0005539510 network[62493]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:13:36 np0005539510 network[62494]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:13:36 np0005539510 network[62495]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:13:41 np0005539510 python3.9[62757]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:41 np0005539510 systemd[1]: Reloading.
Nov 29 01:13:41 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:41 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:41 np0005539510 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 01:13:41 np0005539510 iptables.init[62798]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 01:13:41 np0005539510 iptables.init[62798]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 01:13:41 np0005539510 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 01:13:41 np0005539510 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 01:13:42 np0005539510 python3.9[62995]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:45 np0005539510 python3.9[63149]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:13:45 np0005539510 systemd[1]: Reloading.
Nov 29 01:13:45 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:13:45 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:13:45 np0005539510 systemd[1]: Starting Netfilter Tables...
Nov 29 01:13:45 np0005539510 systemd[1]: Finished Netfilter Tables.
Nov 29 01:13:46 np0005539510 python3.9[63340]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:13:47 np0005539510 python3.9[63493]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:48 np0005539510 python3.9[63618]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396827.1172018-708-262227662948596/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:49 np0005539510 python3.9[63771]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:13:49 np0005539510 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 01:13:49 np0005539510 systemd[1]: Reloaded OpenSSH server daemon.
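The sshd_config copy above runs with `validate=/usr/sbin/sshd -T -f %s`, so the rendered file is syntax-checked before it replaces the live config, and sshd is only reloaded afterwards. A minimal standalone sketch of that validate-then-install pattern; the validator, file names, and config line here are unprivileged stand-ins, not the deployment's real values:

```shell
set -e
tmp=$(mktemp)
printf 'PasswordAuthentication no\n' > "$tmp"
# real validator in the log: /usr/sbin/sshd -T -f "$tmp"
# stand-in check so the sketch runs without sshd or root:
grep -q '^PasswordAuthentication' "$tmp"
# only reached if validation succeeded; real target is /etc/ssh/sshd_config
install -m 0600 "$tmp" ./sshd_config.sample
rm -f "$tmp"
```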
Nov 29 01:13:50 np0005539510 python3.9[63927]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:51 np0005539510 python3.9[64079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:51 np0005539510 python3.9[64202]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396830.5878987-801-137183532355338/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:53 np0005539510 python3.9[64355]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 01:13:53 np0005539510 systemd[1]: Starting Time & Date Service...
Nov 29 01:13:53 np0005539510 systemd[1]: Started Time & Date Service.
Nov 29 01:13:54 np0005539510 python3.9[64511]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:55 np0005539510 python3.9[64663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:55 np0005539510 python3.9[64786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396834.5090587-906-69822310045952/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:56 np0005539510 python3.9[64938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:56 np0005539510 python3.9[65061]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764396836.0172725-951-131113944525259/.source.yaml _original_basename=.kxc91fh0 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:58 np0005539510 python3.9[65213]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:13:58 np0005539510 python3.9[65336]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396837.5864322-995-277978008204517/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:13:59 np0005539510 python3.9[65488]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:00 np0005539510 python3.9[65641]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:01 np0005539510 python3[65794]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:14:02 np0005539510 python3.9[65946]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:02 np0005539510 python3.9[66069]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396841.9311297-1113-108322063342529/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:04 np0005539510 python3.9[66221]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:04 np0005539510 python3.9[66344]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396843.5848095-1158-148001169850308/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:05 np0005539510 python3.9[66496]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:06 np0005539510 python3.9[66619]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396845.1424658-1203-37248642413795/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:07 np0005539510 python3.9[66771]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:07 np0005539510 python3.9[66894]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396846.6796222-1247-36403981546541/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:08 np0005539510 irqbalance[780]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 29 01:14:08 np0005539510 irqbalance[780]: IRQ 26 affinity is now unmanaged
Nov 29 01:14:08 np0005539510 python3.9[67046]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:14:09 np0005539510 python3.9[67169]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764396848.2262766-1293-262802204904238/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:10 np0005539510 python3.9[67321]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:11 np0005539510 python3.9[67473]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:12 np0005539510 python3.9[67632]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
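The blockinfile task above appends a managed block to /etc/sysconfig/nftables.conf (validated with `nft -c -f %s`) so the generated rule files load at boot. Decoding the `#012` newline escapes in its `block=` parameter, the inserted block is:

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```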
Nov 29 01:14:13 np0005539510 python3.9[67785]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:14 np0005539510 python3.9[67937]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:15 np0005539510 python3.9[68089]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:14:16 np0005539510 python3.9[68242]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
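The two ansible.posix.mount tasks above (`state=mounted`, `boot=True`) mount hugetlbfs with per-directory page sizes and persist the entries. From the module arguments (`src=none`, `opts=pagesize=...`, `dump=0`, `passno=0`), the resulting /etc/fstab lines are:

```
none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
none /dev/hugepages2M hugetlbfs pagesize=2M 0 0
```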
Nov 29 01:14:16 np0005539510 systemd-logind[784]: Session 15 logged out. Waiting for processes to exit.
Nov 29 01:14:16 np0005539510 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 01:14:16 np0005539510 systemd[1]: session-15.scope: Consumed 38.027s CPU time.
Nov 29 01:14:16 np0005539510 systemd-logind[784]: Removed session 15.
Nov 29 01:14:23 np0005539510 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 01:14:26 np0005539510 systemd-logind[784]: New session 16 of user zuul.
Nov 29 01:14:26 np0005539510 systemd[1]: Started Session 16 of User zuul.
Nov 29 01:14:27 np0005539510 python3.9[68426]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 01:14:28 np0005539510 python3.9[68578]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:14:29 np0005539510 python3.9[68730]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:14:30 np0005539510 python3.9[68882]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=#012 create=True mode=0644 path=/tmp/ansible.ayjbkyvy state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:14:31 np0005539510 python3.9[69034]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ayjbkyvy' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:32 np0005539510 python3.9[69188]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ayjbkyvy state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
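The three tasks above implement a build-in-temp-then-replace pattern for /etc/ssh/ssh_known_hosts: blockinfile assembles the managed host-key block in a tempfile, a shell task cats it over the real file, and the tempfile is removed. A self-contained sketch of the same pattern; file names and the key line here are placeholders, not the real host keys:

```shell
set -e
tmp=$(mktemp /tmp/ansible.XXXXXX)
cat > "$tmp" <<'EOF'
# BEGIN ANSIBLE MANAGED BLOCK
compute-0.example.com ssh-ed25519 AAAA-placeholder-key
# END ANSIBLE MANAGED BLOCK
EOF
# in the log the target is /etc/ssh/ssh_known_hosts; a local stand-in here
cat "$tmp" > ./ssh_known_hosts.sample
rm -f "$tmp"
```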
Nov 29 01:14:32 np0005539510 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 01:14:32 np0005539510 systemd[1]: session-16.scope: Consumed 3.688s CPU time.
Nov 29 01:14:32 np0005539510 systemd-logind[784]: Session 16 logged out. Waiting for processes to exit.
Nov 29 01:14:32 np0005539510 systemd-logind[784]: Removed session 16.
Nov 29 01:14:39 np0005539510 systemd-logind[784]: New session 17 of user zuul.
Nov 29 01:14:39 np0005539510 systemd[1]: Started Session 17 of User zuul.
Nov 29 01:14:40 np0005539510 python3.9[69366]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:14:41 np0005539510 python3.9[69522]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:14:42 np0005539510 python3.9[69676]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:14:44 np0005539510 python3.9[69829]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:45 np0005539510 python3.9[69982]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:14:46 np0005539510 python3.9[70136]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:47 np0005539510 python3.9[70291]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
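Session 17's firewall steps re-establish the chains first (`nft -f /etc/nftables/edpm-chains.nft`), then, because the `.changed` marker file exists, flush and reload the rules in a fixed order before deleting the marker. The ordering contract can be sketched with stand-in files; the real command pipes the concatenation into `nft -f -` rather than a file:

```shell
set -e
printf '# flush chains\n'  > edpm-flushes.nft
printf '# ruleset\n'       > edpm-rules.nft
printf '# update jumps\n'  > edpm-update-jumps.nft
# real: cat ... | nft -f -   (flushes must precede rules, rules precede jumps)
cat edpm-flushes.nft edpm-rules.nft edpm-update-jumps.nft > combined.nft
# marker removal, as in the final task above
rm -f edpm-rules.nft.changed
```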
Nov 29 01:14:47 np0005539510 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 01:14:47 np0005539510 systemd[1]: session-17.scope: Consumed 4.541s CPU time.
Nov 29 01:14:47 np0005539510 systemd-logind[784]: Session 17 logged out. Waiting for processes to exit.
Nov 29 01:14:47 np0005539510 systemd-logind[784]: Removed session 17.
Nov 29 01:14:53 np0005539510 systemd-logind[784]: New session 18 of user zuul.
Nov 29 01:14:53 np0005539510 systemd[1]: Started Session 18 of User zuul.
Nov 29 01:14:54 np0005539510 python3.9[70469]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:14:55 np0005539510 python3.9[70625]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:14:56 np0005539510 python3.9[70709]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:14:58 np0005539510 python3.9[70860]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:14:59 np0005539510 python3.9[71011]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:15:00 np0005539510 python3.9[71161]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:15:00 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:15:01 np0005539510 python3.9[71312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:15:02 np0005539510 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 01:15:02 np0005539510 systemd[1]: session-18.scope: Consumed 5.568s CPU time.
Nov 29 01:15:02 np0005539510 systemd-logind[784]: Session 18 logged out. Waiting for processes to exit.
Nov 29 01:15:02 np0005539510 systemd-logind[784]: Removed session 18.
Nov 29 01:15:08 np0005539510 chronyd[58550]: Selected source 142.4.192.253 (pool.ntp.org)
Nov 29 01:15:11 np0005539510 systemd-logind[784]: New session 19 of user zuul.
Nov 29 01:15:11 np0005539510 systemd[1]: Started Session 19 of User zuul.
Nov 29 01:15:18 np0005539510 python3[72078]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:15:20 np0005539510 python3[72173]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 01:15:21 np0005539510 python3[72200]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 01:15:22 np0005539510 python3[72226]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:15:22 np0005539510 kernel: loop: module loaded
Nov 29 01:15:22 np0005539510 kernel: loop3: detected capacity change from 0 to 14680064
Nov 29 01:15:22 np0005539510 python3[72261]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:15:22 np0005539510 lvm[72264]: PV /dev/loop3 not used.
Nov 29 01:15:22 np0005539510 lvm[72273]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:15:22 np0005539510 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 29 01:15:22 np0005539510 lvm[72275]:  1 logical volume(s) in volume group "ceph_vg0" now active
Nov 29 01:15:23 np0005539510 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 29 01:15:24 np0005539510 python3[72353]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:15:24 np0005539510 python3[72426]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764396923.9633152-37030-191459489699471/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:15:25 np0005539510 python3[72476]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:15:25 np0005539510 systemd[1]: Reloading.
Nov 29 01:15:25 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:15:25 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:15:25 np0005539510 systemd[1]: Starting Ceph OSD losetup...
Nov 29 01:15:25 np0005539510 bash[72516]: /dev/loop3: [64513]:4327940 (/var/lib/ceph-osd-0.img)
Nov 29 01:15:25 np0005539510 systemd[1]: Finished Ceph OSD losetup.
Nov 29 01:15:25 np0005539510 lvm[72518]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:15:25 np0005539510 lvm[72518]: VG ceph_vg0 finished
Nov 29 01:15:28 np0005539510 python3[72542]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:35 np0005539510 systemd[1]: Created slice User Slice of UID 42477.
Nov 29 01:17:35 np0005539510 systemd[1]: Starting User Runtime Directory /run/user/42477...
Nov 29 01:17:35 np0005539510 systemd-logind[784]: New session 20 of user ceph-admin.
Nov 29 01:17:35 np0005539510 systemd[1]: Finished User Runtime Directory /run/user/42477.
Nov 29 01:17:35 np0005539510 systemd[1]: Starting User Manager for UID 42477...
Nov 29 01:17:35 np0005539510 systemd[72593]: Queued start job for default target Main User Target.
Nov 29 01:17:35 np0005539510 systemd[72593]: Created slice User Application Slice.
Nov 29 01:17:35 np0005539510 systemd[72593]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:17:35 np0005539510 systemd[72593]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:17:35 np0005539510 systemd[72593]: Reached target Paths.
Nov 29 01:17:35 np0005539510 systemd[72593]: Reached target Timers.
Nov 29 01:17:35 np0005539510 systemd[72593]: Starting D-Bus User Message Bus Socket...
Nov 29 01:17:35 np0005539510 systemd[72593]: Starting Create User's Volatile Files and Directories...
Nov 29 01:17:35 np0005539510 systemd-logind[784]: New session 22 of user ceph-admin.
Nov 29 01:17:35 np0005539510 systemd[72593]: Finished Create User's Volatile Files and Directories.
Nov 29 01:17:35 np0005539510 systemd[72593]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:17:35 np0005539510 systemd[72593]: Reached target Sockets.
Nov 29 01:17:35 np0005539510 systemd[72593]: Reached target Basic System.
Nov 29 01:17:35 np0005539510 systemd[72593]: Reached target Main User Target.
Nov 29 01:17:35 np0005539510 systemd[72593]: Startup finished in 113ms.
Nov 29 01:17:35 np0005539510 systemd[1]: Started User Manager for UID 42477.
Nov 29 01:17:35 np0005539510 systemd[1]: Started Session 20 of User ceph-admin.
Nov 29 01:17:35 np0005539510 systemd[1]: Started Session 22 of User ceph-admin.
Nov 29 01:17:36 np0005539510 systemd-logind[784]: New session 23 of user ceph-admin.
Nov 29 01:17:36 np0005539510 systemd[1]: Started Session 23 of User ceph-admin.
Nov 29 01:17:36 np0005539510 systemd-logind[784]: New session 24 of user ceph-admin.
Nov 29 01:17:36 np0005539510 systemd[1]: Started Session 24 of User ceph-admin.
Nov 29 01:17:37 np0005539510 systemd-logind[784]: New session 25 of user ceph-admin.
Nov 29 01:17:37 np0005539510 systemd[1]: Started Session 25 of User ceph-admin.
Nov 29 01:17:37 np0005539510 systemd-logind[784]: New session 26 of user ceph-admin.
Nov 29 01:17:37 np0005539510 systemd[1]: Started Session 26 of User ceph-admin.
Nov 29 01:17:38 np0005539510 systemd-logind[784]: New session 27 of user ceph-admin.
Nov 29 01:17:38 np0005539510 systemd[1]: Started Session 27 of User ceph-admin.
Nov 29 01:17:38 np0005539510 systemd-logind[784]: New session 28 of user ceph-admin.
Nov 29 01:17:38 np0005539510 systemd[1]: Started Session 28 of User ceph-admin.
Nov 29 01:17:38 np0005539510 systemd-logind[784]: New session 29 of user ceph-admin.
Nov 29 01:17:38 np0005539510 systemd[1]: Started Session 29 of User ceph-admin.
Nov 29 01:17:39 np0005539510 systemd-logind[784]: New session 30 of user ceph-admin.
Nov 29 01:17:39 np0005539510 systemd[1]: Started Session 30 of User ceph-admin.
Nov 29 01:17:39 np0005539510 systemd-logind[784]: New session 31 of user ceph-admin.
Nov 29 01:17:39 np0005539510 systemd[1]: Started Session 31 of User ceph-admin.
Nov 29 01:17:40 np0005539510 systemd-logind[784]: New session 32 of user ceph-admin.
Nov 29 01:17:40 np0005539510 systemd[1]: Started Session 32 of User ceph-admin.
Nov 29 01:17:40 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:38 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:38 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:39 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:39 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:39 np0005539510 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73617 (sysctl)
Nov 29 01:18:39 np0005539510 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 01:18:39 np0005539510 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 01:18:40 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:41 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:18:45 np0005539510 systemd[1]: var-lib-containers-storage-overlay-compat804025874-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 01:19:09 np0005539510 podman[73893]: 2025-11-29 06:19:09.982071152 +0000 UTC m=+28.714164910 container create 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 01:19:10 np0005539510 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 01:19:10 np0005539510 systemd[1]: Started libpod-conmon-51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b.scope.
Nov 29 01:19:10 np0005539510 podman[73893]: 2025-11-29 06:19:09.968078728 +0000 UTC m=+28.700172526 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:10 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:10 np0005539510 podman[73893]: 2025-11-29 06:19:10.076261001 +0000 UTC m=+28.808354819 container init 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Nov 29 01:19:10 np0005539510 podman[73893]: 2025-11-29 06:19:10.084791342 +0000 UTC m=+28.816885120 container start 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Nov 29 01:19:10 np0005539510 podman[73893]: 2025-11-29 06:19:10.088200561 +0000 UTC m=+28.820294379 container attach 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Nov 29 01:19:10 np0005539510 recursing_swirles[73955]: 167 167
Nov 29 01:19:10 np0005539510 systemd[1]: libpod-51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b.scope: Deactivated successfully.
Nov 29 01:19:10 np0005539510 podman[73893]: 2025-11-29 06:19:10.091596909 +0000 UTC m=+28.823690677 container died 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:10 np0005539510 systemd[1]: var-lib-containers-storage-overlay-5c0c0d2e07127ddb4aeb2fa9832df785b317e2db297c37e229627d9ea11c1aca-merged.mount: Deactivated successfully.
Nov 29 01:19:10 np0005539510 podman[73893]: 2025-11-29 06:19:10.129664939 +0000 UTC m=+28.861758707 container remove 51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=recursing_swirles, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:10 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:10 np0005539510 systemd[1]: libpod-conmon-51769b00f494f2370e7a4e262bcc935c66ebdac3ba702b31a3e0710dff7d714b.scope: Deactivated successfully.
Nov 29 01:19:10 np0005539510 podman[73980]: 2025-11-29 06:19:10.305765217 +0000 UTC m=+0.052567218 container create 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:10 np0005539510 systemd[1]: Started libpod-conmon-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope.
Nov 29 01:19:10 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f7dc80d8b1dac7ac36b089ee3254f9af9877e161a8d97dbd0fc1cf37c2fdb76/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:10 np0005539510 podman[73980]: 2025-11-29 06:19:10.280275284 +0000 UTC m=+0.027077305 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f7dc80d8b1dac7ac36b089ee3254f9af9877e161a8d97dbd0fc1cf37c2fdb76/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:10 np0005539510 podman[73980]: 2025-11-29 06:19:10.409298299 +0000 UTC m=+0.156100330 container init 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:10 np0005539510 podman[73980]: 2025-11-29 06:19:10.420413547 +0000 UTC m=+0.167215548 container start 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Nov 29 01:19:10 np0005539510 podman[73980]: 2025-11-29 06:19:10.425222552 +0000 UTC m=+0.172024533 container attach 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]: [
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:    {
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "available": false,
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "ceph_device": false,
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "lsm_data": {},
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "lvs": [],
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "path": "/dev/sr0",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "rejected_reasons": [
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "Insufficient space (<5GB)",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "Has a FileSystem"
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        ],
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        "sys_api": {
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "actuators": null,
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "device_nodes": "sr0",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "devname": "sr0",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "human_readable_size": "482.00 KB",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "id_bus": "ata",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "model": "QEMU DVD-ROM",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "nr_requests": "2",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "parent": "/dev/sr0",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "partitions": {},
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "path": "/dev/sr0",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "removable": "1",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "rev": "2.5+",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "ro": "0",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "rotational": "1",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "sas_address": "",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "sas_device_handle": "",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "scheduler_mode": "mq-deadline",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "sectors": 0,
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "sectorsize": "2048",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "size": 493568.0,
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "support_discard": "2048",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "type": "disk",
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:            "vendor": "QEMU"
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:        }
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]:    }
Nov 29 01:19:11 np0005539510 sweet_wilson[73996]: ]
Nov 29 01:19:11 np0005539510 systemd[1]: libpod-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope: Deactivated successfully.
Nov 29 01:19:11 np0005539510 systemd[1]: libpod-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope: Consumed 1.148s CPU time.
Nov 29 01:19:11 np0005539510 podman[73980]: 2025-11-29 06:19:11.568396682 +0000 UTC m=+1.315198663 container died 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 01:19:11 np0005539510 systemd[1]: var-lib-containers-storage-overlay-6f7dc80d8b1dac7ac36b089ee3254f9af9877e161a8d97dbd0fc1cf37c2fdb76-merged.mount: Deactivated successfully.
Nov 29 01:19:11 np0005539510 podman[73980]: 2025-11-29 06:19:11.629961712 +0000 UTC m=+1.376763693 container remove 2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sweet_wilson, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef)
Nov 29 01:19:11 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:11 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:11 np0005539510 systemd[1]: libpod-conmon-2bbc87a062441edc7f474df4e1fa84ba8746b785e6a04203c4959f5692849371.scope: Deactivated successfully.
Nov 29 01:19:16 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:16 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.674019362 +0000 UTC m=+0.045522005 container create ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Nov 29 01:19:16 np0005539510 systemd[1]: Started libpod-conmon-ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd.scope.
Nov 29 01:19:16 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.733274262 +0000 UTC m=+0.104776915 container init ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.741303311 +0000 UTC m=+0.112805974 container start ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.647233545 +0000 UTC m=+0.018736228 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.745577662 +0000 UTC m=+0.117080375 container attach ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:16 np0005539510 intelligent_nightingale[76804]: 167 167
Nov 29 01:19:16 np0005539510 systemd[1]: libpod-ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd.scope: Deactivated successfully.
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.748790136 +0000 UTC m=+0.120292809 container died ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:16 np0005539510 podman[76788]: 2025-11-29 06:19:16.783861117 +0000 UTC m=+0.155363770 container remove ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_nightingale, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:16 np0005539510 systemd[1]: libpod-conmon-ab3e289e0f9f0df752b30822e3a956b9fcb726661e0fd0016332d98a841503cd.scope: Deactivated successfully.
Nov 29 01:19:16 np0005539510 podman[76824]: 2025-11-29 06:19:16.847458251 +0000 UTC m=+0.037959908 container create 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Nov 29 01:19:16 np0005539510 systemd[1]: Started libpod-conmon-4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50.scope.
Nov 29 01:19:16 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:16 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:16 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:16 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:16 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af18896dca3eb0eff10a9ef81f8d1123d450ac2a35305a029259dad8d4bee463/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:16 np0005539510 podman[76824]: 2025-11-29 06:19:16.903007035 +0000 UTC m=+0.093508712 container init 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:16 np0005539510 podman[76824]: 2025-11-29 06:19:16.910714135 +0000 UTC m=+0.101215822 container start 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Nov 29 01:19:16 np0005539510 podman[76824]: 2025-11-29 06:19:16.914300378 +0000 UTC m=+0.104802045 container attach 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:16 np0005539510 podman[76824]: 2025-11-29 06:19:16.830007447 +0000 UTC m=+0.020509124 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:16 np0005539510 systemd[1]: libpod-4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50.scope: Deactivated successfully.
Nov 29 01:19:17 np0005539510 podman[76866]: 2025-11-29 06:19:17.034228446 +0000 UTC m=+0.021924041 container died 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:19:17 np0005539510 podman[76866]: 2025-11-29 06:19:17.066011832 +0000 UTC m=+0.053707417 container remove 4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=silly_matsumoto, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:17 np0005539510 systemd[1]: libpod-conmon-4032b23a48b52c51433f27fdeb01e3351e090fdaf79f1ee1404a6b92ece42c50.scope: Deactivated successfully.
Nov 29 01:19:17 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:17 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:17 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:17 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:17 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:17 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:17 np0005539510 systemd[1]: Reached target All Ceph clusters and services.
Nov 29 01:19:17 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:17 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:17 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:17 np0005539510 systemd[1]: Reached target Ceph cluster 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:17 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:17 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:17 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:18 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:18 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:18 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:18 np0005539510 systemd[1]: Created slice Slice /system/ceph-336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:18 np0005539510 systemd[1]: Reached target System Time Set.
Nov 29 01:19:18 np0005539510 systemd[1]: Reached target System Time Synchronized.
Nov 29 01:19:18 np0005539510 systemd[1]: Starting Ceph mon.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:19:18 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:18 np0005539510 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:19:18 np0005539510 podman[77122]: 2025-11-29 06:19:18.552103687 +0000 UTC m=+0.036385177 container create 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3)
Nov 29 01:19:18 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31412971db94b5a28cc98a3068e2335e13b119df5a46bfd577c8c751af35ed6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:18 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31412971db94b5a28cc98a3068e2335e13b119df5a46bfd577c8c751af35ed6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:18 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d31412971db94b5a28cc98a3068e2335e13b119df5a46bfd577c8c751af35ed6/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:18 np0005539510 podman[77122]: 2025-11-29 06:19:18.605289599 +0000 UTC m=+0.089571109 container init 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:18 np0005539510 podman[77122]: 2025-11-29 06:19:18.60992894 +0000 UTC m=+0.094210430 container start 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:18 np0005539510 bash[77122]: 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1
Nov 29 01:19:18 np0005539510 podman[77122]: 2025-11-29 06:19:18.536577123 +0000 UTC m=+0.020858643 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:18 np0005539510 systemd[1]: Started Ceph mon.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: pidfile_write: ignore empty --pid-file
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: load: jerasure load: lrc 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: RocksDB version: 7.9.2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Git sha 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: DB SUMMARY
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: DB Session ID:  VR5455MVOXQY2YZBKO9G
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: CURRENT file:  CURRENT
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                         Options.error_if_exists: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.create_if_missing: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                                     Options.env: 0x55be8793bc40
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                                Options.info_log: 0x55be896fafc0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                              Options.statistics: (nil)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                               Options.use_fsync: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                              Options.db_log_dir: 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                                 Options.wal_dir: 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                    Options.write_buffer_manager: 0x55be8970ab40
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.unordered_write: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                               Options.row_cache: None
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                              Options.wal_filter: None
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.two_write_queues: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.wal_compression: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.atomic_flush: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.max_background_jobs: 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.max_background_compactions: -1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.max_subcompactions: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.max_total_wal_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                          Options.max_open_files: -1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:       Options.compaction_readahead_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Compression algorithms supported:
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kZSTD supported: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kXpressCompression supported: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kBZip2Compression supported: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kLZ4Compression supported: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kZlibCompression supported: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: #011kSnappyCompression supported: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:           Options.merge_operator: 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55be896fac00)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55be896f31f0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:        Options.write_buffer_size: 33554432
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:  Options.max_write_buffer_number: 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.compression: NoCompression
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 70291dfa-fb4b-4030-8b2f-275b626805e0
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397158651407, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397158653206, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397158653305, "job": 1, "event": "recovery_finished"}
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55be8971ce00
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: DB pointer 0x55be897a6000
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.64 KB,0.00012219%)#012#012** File Read Latency Histogram By Level [default] **
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(???) e0 preinit fsid 336ec58c-893b-528f-a0c1-6ed1196bc047
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).mds e1 new map
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 3314933000852226048, adjusting msgr requires
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).osd e29 crush map has features 288514051259236352, adjusting msgr requires
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/577122409' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/1457732535' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2491487437' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2900095816' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/956031255' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check update: 6 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2774593808' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/3785446785' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/3924631149' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/935132046' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.client.admin.keyring
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/1714792720' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Deploying daemon mon.compute-2 on compute-2
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check cleared: CEPHADM_REFRESH_FAILED (was: failed to probe daemons or devices)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2338482810' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Nov 29 01:19:18 np0005539510 ceph-mon[77142]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Nov 29 01:19:20 np0005539510 ceph-mon[77142]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Nov 29 01:19:20 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 01:19:20 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 29 01:19:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 29 01:19:22 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 29 01:19:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 29 01:19:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Nov 29 01:19:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:23 np0005539510 ceph-mon[77142]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2025-11-29T06:19:16.947562Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,os=Linux}
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: Deploying daemon mon.compute-1 on compute-1
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(10) init, last seen epoch 10
Nov 29 01:19:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.658891393 +0000 UTC m=+0.036760066 container create cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:24 np0005539510 systemd[1]: Started libpod-conmon-cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539.scope.
Nov 29 01:19:24 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.733429501 +0000 UTC m=+0.111298204 container init cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.642750854 +0000 UTC m=+0.020619537 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.741222414 +0000 UTC m=+0.119091087 container start cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.744338095 +0000 UTC m=+0.122206768 container attach cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:24 np0005539510 relaxed_euler[77338]: 167 167
Nov 29 01:19:24 np0005539510 systemd[1]: libpod-cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539.scope: Deactivated successfully.
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.747170268 +0000 UTC m=+0.125038951 container died cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 01:19:24 np0005539510 systemd[1]: var-lib-containers-storage-overlay-03093a32d0d5dfba5098240b29cd21f28a2982381d0308b4235c14c7d57fa0de-merged.mount: Deactivated successfully.
Nov 29 01:19:24 np0005539510 podman[77322]: 2025-11-29 06:19:24.780068164 +0000 UTC m=+0.157936867 container remove cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=relaxed_euler, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Nov 29 01:19:24 np0005539510 systemd[1]: libpod-conmon-cc2130569eac4719b0aa372133a75148ec48d5025c28654a28b12d72cde17539.scope: Deactivated successfully.
Nov 29 01:19:24 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:24 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:24 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:25 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:25 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:25 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:25 np0005539510 systemd[1]: Starting Ceph mgr.compute-2.ngsyhe for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:19:25 np0005539510 podman[77485]: 2025-11-29 06:19:25.483264035 +0000 UTC m=+0.019501458 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:25 np0005539510 podman[77485]: 2025-11-29 06:19:25.980363908 +0000 UTC m=+0.516601301 container create 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Nov 29 01:19:26 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:26 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:26 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:26 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d784d0f3a431ea69c785c946dd6a3c92bae52140bd72bec2679550664a59bd97/merged/var/lib/ceph/mgr/ceph-compute-2.ngsyhe supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:26 np0005539510 podman[77485]: 2025-11-29 06:19:26.096892717 +0000 UTC m=+0.633130130 container init 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:26 np0005539510 podman[77485]: 2025-11-29 06:19:26.102288637 +0000 UTC m=+0.638526030 container start 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:26 np0005539510 bash[77485]: 08bcce8f2a322c8ab979069b9ba321569afcdd4bcb6f299dc6807bd13b238413
Nov 29 01:19:26 np0005539510 systemd[1]: Started Ceph mgr.compute-2.ngsyhe for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:27 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:29 np0005539510 ceph-mgr[77504]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:19:29 np0005539510 ceph-mgr[77504]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Nov 29 01:19:29 np0005539510 ceph-mgr[77504]: pidfile_write: ignore empty --pid-file
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:29 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'alerts'
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: Deploying daemon mgr.compute-2.ngsyhe on compute-2
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:19:29 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/501439537' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Nov 29 01:19:29 np0005539510 ceph-mgr[77504]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 01:19:29 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'balancer'
Nov 29 01:19:29 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:29.813+0000 7f62940bd140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 29 01:19:30 np0005539510 ceph-mgr[77504]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 01:19:30 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'cephadm'
Nov 29 01:19:30 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:30.090+0000 7f62940bd140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.gaxpay", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 29 01:19:30 np0005539510 ceph-mon[77142]: Deploying daemon mgr.compute-1.gaxpay on compute-1
Nov 29 01:19:31 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2714267067' entity='client.admin' 
Nov 29 01:19:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 29 01:19:32 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'crash'
Nov 29 01:19:32 np0005539510 ceph-mgr[77504]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 01:19:32 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'dashboard'
Nov 29 01:19:32 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:32.385+0000 7f62940bd140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: Saving service ingress.rgw.default spec with placement count:2
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Nov 29 01:19:32 np0005539510 ceph-mon[77142]: Deploying daemon crash.compute-2 on compute-2
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.185323283 +0000 UTC m=+0.045268927 container create e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:33 np0005539510 systemd[1]: Started libpod-conmon-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope.
Nov 29 01:19:33 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.167924978 +0000 UTC m=+0.027870642 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.270554866 +0000 UTC m=+0.130500560 container init e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.277289903 +0000 UTC m=+0.137235567 container start e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.281987426 +0000 UTC m=+0.141933070 container attach e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, ceph=True, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 01:19:33 np0005539510 nervous_yonath[77697]: 167 167
Nov 29 01:19:33 np0005539510 systemd[1]: libpod-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope: Deactivated successfully.
Nov 29 01:19:33 np0005539510 conmon[77697]: conmon e4128caa21807f2bbfcd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope/container/memory.events
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.288480316 +0000 UTC m=+0.148425970 container died e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:33 np0005539510 systemd[1]: var-lib-containers-storage-overlay-3eb33ebd42ed44668431591fb691b15dc01d80f896bb93ac2acd6f5b4cdc7f19-merged.mount: Deactivated successfully.
Nov 29 01:19:33 np0005539510 podman[77681]: 2025-11-29 06:19:33.336018682 +0000 UTC m=+0.195964326 container remove e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nervous_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:33 np0005539510 systemd[1]: libpod-conmon-e4128caa21807f2bbfcdcdb18010b58c359eec2191cf9659ea5a110dad6ac8a5.scope: Deactivated successfully.
Nov 29 01:19:33 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:33 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:33 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e29 _set_new_cache_sizes cache_size:1019926139 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:33 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:33 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:33 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:33 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'devicehealth'
Nov 29 01:19:33 np0005539510 systemd[1]: Starting Ceph crash.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:19:34 np0005539510 podman[77843]: 2025-11-29 06:19:34.118001388 +0000 UTC m=+0.032777780 container create 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:34.142+0000 7f62940bd140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 01:19:34 np0005539510 ceph-mgr[77504]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 29 01:19:34 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'diskprediction_local'
Nov 29 01:19:34 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:34 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:34 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:34 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8fd6eb729b8ca574b1224f3cec7bbcb055f3a129aec949dc83fcb6c3ad1a8ab/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:34 np0005539510 podman[77843]: 2025-11-29 06:19:34.163370237 +0000 UTC m=+0.078146629 container init 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 01:19:34 np0005539510 podman[77843]: 2025-11-29 06:19:34.170191765 +0000 UTC m=+0.084968157 container start 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3)
Nov 29 01:19:34 np0005539510 bash[77843]: 0ad5ea54c4d3a884204483ac831e854807deecb353611aa286eddde0eac40b49
Nov 29 01:19:34 np0005539510 podman[77843]: 2025-11-29 06:19:34.104027672 +0000 UTC m=+0.018804084 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:34 np0005539510 systemd[1]: Started Ceph crash.compute-2 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.548+0000 7f718b01f640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.548+0000 7f718b01f640 -1 AuthRegistry(0x7f71840675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.549+0000 7f718b01f640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.549+0000 7f718b01f640 -1 AuthRegistry(0x7f718b01e000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.550+0000 7f7189595640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.552+0000 7f7188d94640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.553+0000 7f7183fff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: 2025-11-29T06:19:34.554+0000 7f718b01f640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-crash-compute-2[77859]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]:  from numpy import show_config as show_numpy_config
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:34.690+0000 7f62940bd140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 01:19:34 np0005539510 ceph-mgr[77504]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 29 01:19:34 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'influx'
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.866296253 +0000 UTC m=+0.046958512 container create 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:34 np0005539510 systemd[1]: Started libpod-conmon-30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca.scope.
Nov 29 01:19:34 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:34 np0005539510 ceph-mgr[77504]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 01:19:34 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:34.935+0000 7f62940bd140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 29 01:19:34 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'insights'
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.839911281 +0000 UTC m=+0.020573560 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.948872326 +0000 UTC m=+0.129534605 container init 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.955576002 +0000 UTC m=+0.136238271 container start 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.95931339 +0000 UTC m=+0.139975679 container attach 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Nov 29 01:19:34 np0005539510 nifty_cohen[78030]: 167 167
Nov 29 01:19:34 np0005539510 systemd[1]: libpod-30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca.scope: Deactivated successfully.
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.961821815 +0000 UTC m=+0.142484074 container died 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 01:19:34 np0005539510 systemd[1]: var-lib-containers-storage-overlay-728cd6d1d812d51298ea65e05e41acbdd2baecb9eb9c560c79580fd2e99f8bd2-merged.mount: Deactivated successfully.
Nov 29 01:19:34 np0005539510 podman[78013]: 2025-11-29 06:19:34.996434872 +0000 UTC m=+0.177097131 container remove 30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nifty_cohen, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 01:19:35 np0005539510 systemd[1]: libpod-conmon-30e452c897634541f2681d330b6030b01982af513d2f3c28ddfb49e24a3ce1ca.scope: Deactivated successfully.
Nov 29 01:19:35 np0005539510 podman[78055]: 2025-11-29 06:19:35.150674503 +0000 UTC m=+0.044664501 container create 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Nov 29 01:19:35 np0005539510 systemd[1]: Started libpod-conmon-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope.
Nov 29 01:19:35 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'iostat'
Nov 29 01:19:35 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:35 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:35 np0005539510 podman[78055]: 2025-11-29 06:19:35.133252257 +0000 UTC m=+0.027242255 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:35 np0005539510 podman[78055]: 2025-11-29 06:19:35.235399773 +0000 UTC m=+0.129389781 container init 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:35 np0005539510 podman[78055]: 2025-11-29 06:19:35.241139233 +0000 UTC m=+0.135129231 container start 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:19:35 np0005539510 podman[78055]: 2025-11-29 06:19:35.244979004 +0000 UTC m=+0.138969052 container attach 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:35 np0005539510 ceph-mgr[77504]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 01:19:35 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:35.436+0000 7f62940bd140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 29 01:19:35 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'k8sevents'
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e2 new map
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e2 print_map#012e2#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:19:35.589013+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 
Nov 29 01:19:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e30 e30: 2 total, 2 up, 2 in
Nov 29 01:19:36 np0005539510 brave_noether[78071]: --> passed data devices: 0 physical, 1 LVM
Nov 29 01:19:36 np0005539510 brave_noether[78071]: --> relative data size: 1.0
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new f86a06f9-a09f-46de-8440-929a842d2c66
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"} v 0) v1
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2624547066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 01:19:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Nov 29 01:19:36 np0005539510 lvm[78119]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:19:36 np0005539510 lvm[78119]: VG ceph_vg0 finished
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:36 np0005539510 brave_noether[78071]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Nov 29 01:19:36 np0005539510 systemd[72593]: Starting Mark boot as successful...
Nov 29 01:19:36 np0005539510 systemd[72593]: Finished Mark boot as successful.
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2894938433' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Nov 29 01:19:37 np0005539510 brave_noether[78071]: stderr: got monmap epoch 3
Nov 29 01:19:37 np0005539510 brave_noether[78071]: --> Creating keyring file for osd.2
Nov 29 01:19:37 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Nov 29 01:19:37 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Nov 29 01:19:37 np0005539510 brave_noether[78071]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid f86a06f9-a09f-46de-8440-929a842d2c66 --setuser ceph --setgroup ceph
Nov 29 01:19:37 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'localpool'
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.102:0/2624547066' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]: dispatch
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "f86a06f9-a09f-46de-8440-929a842d2c66"}]': finished
Nov 29 01:19:37 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:37 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'mds_autoscaler'
Nov 29 01:19:38 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'mirroring'
Nov 29 01:19:38 np0005539510 ceph-mon[77142]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 29 01:19:38 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'nfs'
Nov 29 01:19:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020053029 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:39 np0005539510 ceph-mgr[77504]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 01:19:39 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'orchestrator'
Nov 29 01:19:39 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:39.409+0000 7f62940bd140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 29 01:19:39 np0005539510 brave_noether[78071]: stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 01:19:39 np0005539510 brave_noether[78071]: stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 01:19:39 np0005539510 brave_noether[78071]: stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Nov 29 01:19:39 np0005539510 brave_noether[78071]: stderr: 2025-11-29T06:19:37.226+0000 7f1e2dea4740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Nov 29 01:19:39 np0005539510 brave_noether[78071]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 29 01:19:39 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 01:19:39 np0005539510 brave_noether[78071]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 29 01:19:39 np0005539510 brave_noether[78071]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:39 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:39 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:19:39 np0005539510 brave_noether[78071]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 01:19:39 np0005539510 brave_noether[78071]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 29 01:19:39 np0005539510 brave_noether[78071]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 29 01:19:39 np0005539510 systemd[1]: libpod-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope: Deactivated successfully.
Nov 29 01:19:39 np0005539510 systemd[1]: libpod-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope: Consumed 2.524s CPU time.
Nov 29 01:19:39 np0005539510 podman[78055]: 2025-11-29 06:19:39.609344326 +0000 UTC m=+4.503334324 container died 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'osd_perf_query'
Nov 29 01:19:40 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.113+0000 7f62940bd140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'osd_support'
Nov 29 01:19:40 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.392+0000 7f62940bd140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 systemd[1]: var-lib-containers-storage-overlay-b0adc539a7718d5bda0502cc37410003fa5f4aa75667649c6bc283651fd00cbe-merged.mount: Deactivated successfully.
Nov 29 01:19:40 np0005539510 podman[78055]: 2025-11-29 06:19:40.638647182 +0000 UTC m=+5.532637180 container remove 9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=brave_noether, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'pg_autoscaler'
Nov 29 01:19:40 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.652+0000 7f62940bd140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 systemd[1]: libpod-conmon-9f4a78fd5d3af1183bad6dc53c8a6e2483f3f7d1bd6a2b19aef1545898a08ec0.scope: Deactivated successfully.
Nov 29 01:19:40 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Nov 29 01:19:40 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/713391435' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:40.962+0000 7f62940bd140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 29 01:19:40 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'progress'
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.224181662 +0000 UTC m=+0.039725832 container create 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef)
Nov 29 01:19:41 np0005539510 ceph-mgr[77504]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 01:19:41 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'prometheus'
Nov 29 01:19:41 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:41.229+0000 7f62940bd140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 29 01:19:41 np0005539510 systemd[1]: Started libpod-conmon-904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb.scope.
Nov 29 01:19:41 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.293617531 +0000 UTC m=+0.109161721 container init 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.300226054 +0000 UTC m=+0.115770224 container start 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.20655071 +0000 UTC m=+0.022094880 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.30503686 +0000 UTC m=+0.120581030 container attach 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 29 01:19:41 np0005539510 gracious_goldberg[79192]: 167 167
Nov 29 01:19:41 np0005539510 systemd[1]: libpod-904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb.scope: Deactivated successfully.
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.307258138 +0000 UTC m=+0.122802308 container died 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:41 np0005539510 systemd[1]: var-lib-containers-storage-overlay-90eca51c6f16e2253e2894b7c47d6e8603267d9c96cd5d31e117e9877448823b-merged.mount: Deactivated successfully.
Nov 29 01:19:41 np0005539510 podman[79176]: 2025-11-29 06:19:41.344046062 +0000 UTC m=+0.159590242 container remove 904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gracious_goldberg, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:41 np0005539510 systemd[1]: libpod-conmon-904ad1fa6640083e0f8ad249224ec63f25604f6926857edae2c1593b46c22aeb.scope: Deactivated successfully.
Nov 29 01:19:41 np0005539510 podman[79218]: 2025-11-29 06:19:41.494510794 +0000 UTC m=+0.040714187 container create 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 29 01:19:41 np0005539510 systemd[1]: Started libpod-conmon-5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376.scope.
Nov 29 01:19:41 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:41 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:41 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:41 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:41 np0005539510 podman[79218]: 2025-11-29 06:19:41.47715413 +0000 UTC m=+0.023357533 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:41 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:41 np0005539510 podman[79218]: 2025-11-29 06:19:41.582613853 +0000 UTC m=+0.128817256 container init 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Nov 29 01:19:41 np0005539510 podman[79218]: 2025-11-29 06:19:41.590794177 +0000 UTC m=+0.136997580 container start 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:19:41 np0005539510 podman[79218]: 2025-11-29 06:19:41.595178982 +0000 UTC m=+0.141382385 container attach 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Nov 29 01:19:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:42 np0005539510 ceph-mgr[77504]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'rbd_support'
Nov 29 01:19:42 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:42.340+0000 7f62940bd140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]: {
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:    "2": [
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:        {
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "devices": [
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "/dev/loop3"
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            ],
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "lv_name": "ceph_lv0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "lv_size": "7511998464",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=QZmYMa-OSGs-30so-3STC-BZF6-ZfIW-V0Wtxa,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=336ec58c-893b-528f-a0c1-6ed1196bc047,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=f86a06f9-a09f-46de-8440-929a842d2c66,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "lv_uuid": "QZmYMa-OSGs-30so-3STC-BZF6-ZfIW-V0Wtxa",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "name": "ceph_lv0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "path": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "tags": {
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.block_uuid": "QZmYMa-OSGs-30so-3STC-BZF6-ZfIW-V0Wtxa",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.cephx_lockbox_secret": "",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.cluster_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.cluster_name": "ceph",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.crush_device_class": "",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.encrypted": "0",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.osd_fsid": "f86a06f9-a09f-46de-8440-929a842d2c66",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.osd_id": "2",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.osdspec_affinity": "default_drive_group",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.type": "block",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:                "ceph.vdo": "0"
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            },
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "type": "block",
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:            "vg_name": "ceph_vg0"
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:        }
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]:    ]
Nov 29 01:19:42 np0005539510 boring_driscoll[79234]: }
Nov 29 01:19:42 np0005539510 systemd[1]: libpod-5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376.scope: Deactivated successfully.
Nov 29 01:19:42 np0005539510 podman[79218]: 2025-11-29 06:19:42.417103135 +0000 UTC m=+0.963306518 container died 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:42 np0005539510 systemd[1]: var-lib-containers-storage-overlay-6bab3e35b38564394a11ecbdcd18292fa5bceef51a71a9aa6c248d23bbee2a9e-merged.mount: Deactivated successfully.
Nov 29 01:19:42 np0005539510 podman[79218]: 2025-11-29 06:19:42.483535936 +0000 UTC m=+1.029739349 container remove 5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=boring_driscoll, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Nov 29 01:19:42 np0005539510 systemd[1]: libpod-conmon-5c96d204cc534f9aa0439ede96e87175ce95e5d6d12874c7f8fad32323dd2376.scope: Deactivated successfully.
Nov 29 01:19:42 np0005539510 ceph-mgr[77504]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:42.665+0000 7f62940bd140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 29 01:19:42 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'restful'
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.088757892 +0000 UTC m=+0.034731591 container create c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:43 np0005539510 systemd[1]: Started libpod-conmon-c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af.scope.
Nov 29 01:19:43 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.161450757 +0000 UTC m=+0.107424486 container init c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.167896616 +0000 UTC m=+0.113870315 container start c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:43 np0005539510 peaceful_brattain[79413]: 167 167
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.07379885 +0000 UTC m=+0.019772569 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:43 np0005539510 systemd[1]: libpod-c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af.scope: Deactivated successfully.
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.172685811 +0000 UTC m=+0.118659530 container attach c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.173098022 +0000 UTC m=+0.119071711 container died c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Nov 29 01:19:43 np0005539510 systemd[1]: var-lib-containers-storage-overlay-c7c928a5dec5061b4b572ce892ddd032b20eb698483498c0759ee8680949db63-merged.mount: Deactivated successfully.
Nov 29 01:19:43 np0005539510 podman[79397]: 2025-11-29 06:19:43.207933514 +0000 UTC m=+0.153907213 container remove c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=peaceful_brattain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507)
Nov 29 01:19:43 np0005539510 systemd[1]: libpod-conmon-c6adcc4e09037984a97a458ce4a12101b9e8b82ad320486bb022a9f3e21ad7af.scope: Deactivated successfully.
Nov 29 01:19:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Nov 29 01:19:43 np0005539510 ceph-mon[77142]: Deploying daemon osd.2 on compute-2
Nov 29 01:19:43 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'rgw'
Nov 29 01:19:43 np0005539510 podman[79445]: 2025-11-29 06:19:43.479088019 +0000 UTC m=+0.041846088 container create 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:43 np0005539510 systemd[1]: Started libpod-conmon-7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678.scope.
Nov 29 01:19:43 np0005539510 podman[79445]: 2025-11-29 06:19:43.46083122 +0000 UTC m=+0.023589299 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:43 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054709 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:43 np0005539510 podman[79445]: 2025-11-29 06:19:43.68373358 +0000 UTC m=+0.246491659 container init 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 01:19:43 np0005539510 podman[79445]: 2025-11-29 06:19:43.690376384 +0000 UTC m=+0.253134443 container start 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:43 np0005539510 podman[79445]: 2025-11-29 06:19:43.69784982 +0000 UTC m=+0.260607879 container attach 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:44 np0005539510 ceph-mgr[77504]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 01:19:44 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'rook'
Nov 29 01:19:44 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:44.130+0000 7f62940bd140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 29 01:19:44 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test[79461]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Nov 29 01:19:44 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test[79461]:                            [--no-systemd] [--no-tmpfs]
Nov 29 01:19:44 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test[79461]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 29 01:19:44 np0005539510 systemd[1]: libpod-7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678.scope: Deactivated successfully.
Nov 29 01:19:44 np0005539510 podman[79445]: 2025-11-29 06:19:44.428158323 +0000 UTC m=+0.990916392 container died 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:19:44 np0005539510 systemd[1]: var-lib-containers-storage-overlay-33c22b0e5608971f9523b0a01a59dcfd1b2717d0fb036f4fd04ed9059c63d330-merged.mount: Deactivated successfully.
Nov 29 01:19:44 np0005539510 podman[79445]: 2025-11-29 06:19:44.490566948 +0000 UTC m=+1.053325007 container remove 7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate-test, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2)
Nov 29 01:19:44 np0005539510 systemd[1]: libpod-conmon-7665363d404168121cc7136b62ea48b468abe861ab588caa97c4be5632405678.scope: Deactivated successfully.
Nov 29 01:19:44 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:44 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:44 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:45 np0005539510 systemd[1]: Reloading.
Nov 29 01:19:45 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:19:45 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:45 np0005539510 systemd[1]: Starting Ceph osd.2 for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:19:45 np0005539510 podman[79626]: 2025-11-29 06:19:45.67467379 +0000 UTC m=+0.041728684 container create 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Nov 29 01:19:45 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:45 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:45 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:45 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:45 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:45 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:45 np0005539510 podman[79626]: 2025-11-29 06:19:45.734700033 +0000 UTC m=+0.101754937 container init 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:45 np0005539510 podman[79626]: 2025-11-29 06:19:45.744186551 +0000 UTC m=+0.111241445 container start 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Nov 29 01:19:45 np0005539510 podman[79626]: 2025-11-29 06:19:45.74756483 +0000 UTC m=+0.114619724 container attach 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Nov 29 01:19:45 np0005539510 podman[79626]: 2025-11-29 06:19:45.656088423 +0000 UTC m=+0.023143337 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:46 np0005539510 ceph-mgr[77504]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:46.372+0000 7f62940bd140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'selftest'
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 01:19:46 np0005539510 bash[79626]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:19:46 np0005539510 bash[79626]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:19:46 np0005539510 bash[79626]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:19:46 np0005539510 bash[79626]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:46 np0005539510 bash[79626]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 01:19:46 np0005539510 bash[79626]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:46.651+0000 7f62940bd140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539510 ceph-mgr[77504]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'snap_schedule'
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate[79642]: --> ceph-volume raw activate successful for osd ID: 2
Nov 29 01:19:46 np0005539510 bash[79626]: --> ceph-volume raw activate successful for osd ID: 2
Nov 29 01:19:46 np0005539510 systemd[1]: libpod-90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a.scope: Deactivated successfully.
Nov 29 01:19:46 np0005539510 podman[79626]: 2025-11-29 06:19:46.691671285 +0000 UTC m=+1.058726169 container died 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:46 np0005539510 systemd[1]: var-lib-containers-storage-overlay-587ff30fd6023781cb31c4a2c13dc569a785e38537b8fbd4236b5165712bdd68-merged.mount: Deactivated successfully.
Nov 29 01:19:46 np0005539510 podman[79626]: 2025-11-29 06:19:46.751516632 +0000 UTC m=+1.118571526 container remove 90a85180f40cb2053ce972be4ece84f6b47c7ab86a4628ce178b95a40cd7d44a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:46 np0005539510 ceph-mgr[77504]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:46.913+0000 7f62940bd140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 29 01:19:46 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'stats'
Nov 29 01:19:46 np0005539510 podman[79803]: 2025-11-29 06:19:46.940907214 +0000 UTC m=+0.041696553 container create 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:46 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:46 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:46 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:47 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:47 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0edc42acad6bcc34eb236157944739c48cd471a327fc883f8ecca021512e5dce/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:47 np0005539510 podman[79803]: 2025-11-29 06:19:47.013566308 +0000 UTC m=+0.114355677 container init 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:19:47 np0005539510 podman[79803]: 2025-11-29 06:19:47.020826038 +0000 UTC m=+0.121615387 container start 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:19:47 np0005539510 podman[79803]: 2025-11-29 06:19:46.924658559 +0000 UTC m=+0.025447908 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:47 np0005539510 bash[79803]: 30804851543e8389a8073c09df360b14bc4f5c48fe90d3035f6911fdf735c892
Nov 29 01:19:47 np0005539510 systemd[1]: Started Ceph osd.2 for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: pidfile_write: ignore empty --pid-file
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a0f000 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 01:19:47 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'status'
Nov 29 01:19:47 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/2969688060' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fecfc03c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 01:19:47 np0005539510 ceph-mgr[77504]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 01:19:47 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'telegraf'
Nov 29 01:19:47 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:47.453+0000 7f62940bd140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: load: jerasure load: lrc 
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 01:19:47 np0005539510 ceph-mgr[77504]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 01:19:47 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:47.700+0000 7f62940bd140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 29 01:19:47 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'telemetry'
Nov 29 01:19:47 np0005539510 podman[79982]: 2025-11-29 06:19:47.86441752 +0000 UTC m=+0.068835155 container create d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True, org.label-schema.license=GPLv2)
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:19:47 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 01:19:47 np0005539510 systemd[1]: Started libpod-conmon-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope.
Nov 29 01:19:47 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:47 np0005539510 podman[79982]: 2025-11-29 06:19:47.847862076 +0000 UTC m=+0.052279731 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:47 np0005539510 podman[79982]: 2025-11-29 06:19:47.955877786 +0000 UTC m=+0.160295451 container init d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef)
Nov 29 01:19:47 np0005539510 podman[79982]: 2025-11-29 06:19:47.972583253 +0000 UTC m=+0.177000888 container start d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:19:47 np0005539510 podman[79982]: 2025-11-29 06:19:47.976350492 +0000 UTC m=+0.180768137 container attach d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Nov 29 01:19:47 np0005539510 systemd[1]: libpod-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope: Deactivated successfully.
Nov 29 01:19:47 np0005539510 musing_williamson[80002]: 167 167
Nov 29 01:19:47 np0005539510 conmon[80002]: conmon d371ac0ff2da37e5369a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope/container/memory.events
Nov 29 01:19:47 np0005539510 podman[79982]: 2025-11-29 06:19:47.980012388 +0000 UTC m=+0.184430043 container died d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:48 np0005539510 systemd[1]: var-lib-containers-storage-overlay-790d7b2f8c6a7e0d4a25b7ac07631a5fdb815c616996821c22007e17ff089846-merged.mount: Deactivated successfully.
Nov 29 01:19:48 np0005539510 podman[79982]: 2025-11-29 06:19:48.043560033 +0000 UTC m=+0.247977688 container remove d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_williamson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Nov 29 01:19:48 np0005539510 systemd[1]: libpod-conmon-d371ac0ff2da37e5369ad4839e9d10c642671b9ba4d56f11cef2542f7245e174.scope: Deactivated successfully.
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a90c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs mount
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs mount shared_bdev_used = 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: RocksDB version: 7.9.2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Git sha 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DB SUMMARY
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DB Session ID:  IRL5VW3ZF53NYTB339J7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: CURRENT file:  CURRENT
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.error_if_exists: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.create_if_missing: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                     Options.env: 0x55fed0a93f10
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                Options.info_log: 0x55fecfc80c80
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                              Options.statistics: (nil)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.use_fsync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                              Options.db_log_dir: 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.write_buffer_manager: 0x55fed0ba8460
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.unordered_write: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.row_cache: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                              Options.wal_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.two_write_queues: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.wal_compression: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.atomic_flush: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_background_jobs: 4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_background_compactions: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_subcompactions: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.max_open_files: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Compression algorithms supported:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kZSTD supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kXpressCompression supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kBZip2Compression supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kLZ4Compression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kZlibCompression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kLZ4HCCompression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: 	kSnappyCompression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc76dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc76dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc76dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc76dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc76dd0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fecfc76dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fecfc76dd0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fecfc76430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fecfc76430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc806e0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fecfc76430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2f5dfbc6-dc11-46fc-bc15-f484fccc197b
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188179388, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188179602, "job": 1, "event": "recovery_finished"}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: freelist init
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: freelist _read_cfg
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs umount
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) close
Nov 29 01:19:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:48 np0005539510 podman[80025]: 2025-11-29 06:19:48.208443652 +0000 UTC m=+0.047231499 container create 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 29 01:19:48 np0005539510 systemd[1]: Started libpod-conmon-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope.
Nov 29 01:19:48 np0005539510 podman[80025]: 2025-11-29 06:19:48.188970521 +0000 UTC m=+0.027758458 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:48 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:48 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:48 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:48 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:48 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:19:48 np0005539510 podman[80025]: 2025-11-29 06:19:48.305858944 +0000 UTC m=+0.144646831 container init 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:19:48 np0005539510 podman[80025]: 2025-11-29 06:19:48.313461403 +0000 UTC m=+0.152249280 container start 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default)
Nov 29 01:19:48 np0005539510 podman[80025]: 2025-11-29 06:19:48.316732929 +0000 UTC m=+0.155520876 container attach 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:48 np0005539510 ceph-mgr[77504]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 01:19:48 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'test_orchestrator'
Nov 29 01:19:48 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:48.347+0000 7f62940bd140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bdev(0x55fed0a91400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs mount
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluefs mount shared_bdev_used = 4718592
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: RocksDB version: 7.9.2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Git sha 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Compile date 2025-05-06 23:30:25
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DB SUMMARY
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DB Session ID:  IRL5VW3ZF53NYTB339J6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: CURRENT file:  CURRENT
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: IDENTITY file:  IDENTITY
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.error_if_exists: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.create_if_missing: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.paranoid_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                     Options.env: 0x55fecfdc84d0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                Options.info_log: 0x55fecfc81920
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_file_opening_threads: 16
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                              Options.statistics: (nil)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.use_fsync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.max_log_file_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.allow_fallocate: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.use_direct_reads: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.create_missing_column_families: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                              Options.db_log_dir: 
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                                 Options.wal_dir: db.wal
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.advise_random_on_open: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.write_buffer_manager: 0x55fed0ba8460
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                            Options.rate_limiter: (nil)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.unordered_write: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.row_cache: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                              Options.wal_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.allow_ingest_behind: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.two_write_queues: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.manual_wal_flush: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.wal_compression: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.atomic_flush: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.log_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.allow_data_in_errors: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.db_host_id: __hostname__
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_background_jobs: 4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_background_compactions: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_subcompactions: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.max_open_files: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.bytes_per_sync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.max_background_flushes: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Compression algorithms supported:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kZSTD supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kXpressCompression supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kBZip2Compression supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kZSTDNotFinalCompression supported: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kLZ4Compression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kZlibCompression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kLZ4HCCompression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         kSnappyCompression supported: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a100)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc77350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a040)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc774b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a040)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55fecfc774b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:           Options.merge_operator: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.compaction_filter_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.sst_partitioner_factory: None
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fecfc8a040)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55fecfc774b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.write_buffer_size: 16777216
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.max_write_buffer_number: 64
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.compression: LZ4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.num_levels: 7
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.level: 32767
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.compression_opts.strategy: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                  Options.compression_opts.enabled: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.arena_block_size: 1048576
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.disable_auto_compactions: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.inplace_update_support: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.bloom_locality: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                    Options.max_successive_merges: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.paranoid_file_checks: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.force_consistency_checks: 1
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.report_bg_io_stats: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                               Options.ttl: 2592000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                       Options.enable_blob_files: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                           Options.min_blob_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                          Options.blob_file_size: 268435456
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb:                Options.blob_file_starting_level: 0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2f5dfbc6-dc11-46fc-bc15-f484fccc197b
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188458299, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188464068, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397188, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f5dfbc6-dc11-46fc-bc15-f484fccc197b", "db_session_id": "IRL5VW3ZF53NYTB339J6", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188466418, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397188, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f5dfbc6-dc11-46fc-bc15-f484fccc197b", "db_session_id": "IRL5VW3ZF53NYTB339J6", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188468743, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397188, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2f5dfbc6-dc11-46fc-bc15-f484fccc197b", "db_session_id": "IRL5VW3ZF53NYTB339J6", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397188470021, "job": 1, "event": "recovery_finished"}
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55fed0aabc00
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: DB pointer 0x55fed0b91a00
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: _get_class not permitted to load lua
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: _get_class not permitted to load sdk
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: _get_class not permitted to load test_remote_reads
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 load_pgs
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 load_pgs opened 0 pgs
Nov 29 01:19:48 np0005539510 ceph-osd[79822]: osd.2 0 log_to_monitors true
Nov 29 01:19:48 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2[79818]: 2025-11-29T06:19:48.496+0000 7fd15ca6c740 -1 osd.2 0 log_to_monitors true
Nov 29 01:19:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Nov 29 01:19:48 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 01:19:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:49 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:49.076+0000 7f62940bd140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539510 ceph-mgr[77504]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'volumes'
Nov 29 01:19:49 np0005539510 clever_almeida[80235]: {
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:    "f86a06f9-a09f-46de-8440-929a842d2c66": {
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:        "ceph_fsid": "336ec58c-893b-528f-a0c1-6ed1196bc047",
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:        "osd_id": 2,
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:        "osd_uuid": "f86a06f9-a09f-46de-8440-929a842d2c66",
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:        "type": "bluestore"
Nov 29 01:19:49 np0005539510 clever_almeida[80235]:    }
Nov 29 01:19:49 np0005539510 clever_almeida[80235]: }
Nov 29 01:19:49 np0005539510 systemd[1]: libpod-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope: Deactivated successfully.
Nov 29 01:19:49 np0005539510 systemd[1]: libpod-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope: Consumed 1.038s CPU time.
Nov 29 01:19:49 np0005539510 podman[80025]: 2025-11-29 06:19:49.346747354 +0000 UTC m=+1.185535201 container died 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef)
Nov 29 01:19:49 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 29 01:19:49 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 29 01:19:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Nov 29 01:19:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Nov 29 01:19:49 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 01:19:49 np0005539510 systemd[1]: var-lib-containers-storage-overlay-f13f68a42db64cb3a604ffc6f71e41d67c01ba21a9ad610540e0c80e1503601a-merged.mount: Deactivated successfully.
Nov 29 01:19:49 np0005539510 ceph-mon[77142]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 01:19:49 np0005539510 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 29 01:19:49 np0005539510 podman[80025]: 2025-11-29 06:19:49.645970114 +0000 UTC m=+1.484758001 container remove 5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=clever_almeida, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507)
Nov 29 01:19:49 np0005539510 systemd[1]: libpod-conmon-5718781d011c6e64613911e13959e4269945706970090cfe8ff64e8e59430d67.scope: Deactivated successfully.
Nov 29 01:19:49 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:49.890+0000 7f62940bd140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539510 ceph-mgr[77504]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 29 01:19:49 np0005539510 ceph-mgr[77504]: mgr[py] Loading python module 'zabbix'
Nov 29 01:19:50 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mgr-compute-2-ngsyhe[77500]: 2025-11-29T06:19:50.160+0000 7f62940bd140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 01:19:50 np0005539510 ceph-mgr[77504]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 29 01:19:50 np0005539510 ceph-mgr[77504]: ms_deliver_dispatch: unhandled message 0x563492cf5080 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Nov 29 01:19:50 np0005539510 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:19:51 np0005539510 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Nov 29 01:19:51 np0005539510 ceph-mon[77142]: from='osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 01:19:51 np0005539510 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 29 01:19:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:51 np0005539510 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:19:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0 done with init, starting boot process
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0 start_boot
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 29 01:19:52 np0005539510 ceph-osd[79822]: osd.2 0  bench count 12288000 bsize 4 KiB
Nov 29 01:19:52 np0005539510 podman[80707]: 2025-11-29 06:19:52.117091945 +0000 UTC m=+0.943771607 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:19:52 np0005539510 podman[80707]: 2025-11-29 06:19:52.465401501 +0000 UTC m=+1.292081083 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:19:54 np0005539510 ceph-mon[77142]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Nov 29 01:19:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:19:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e34 e34: 3 total, 2 up, 3 in
Nov 29 01:19:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:19:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:19:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:19:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:19:58 np0005539510 podman[81069]: 2025-11-29 06:19:58.375402048 +0000 UTC m=+0.024927735 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:19:59 np0005539510 podman[81069]: 2025-11-29 06:19:59.206975763 +0000 UTC m=+0.856501430 container create e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 01:19:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e35 e35: 3 total, 2 up, 3 in
Nov 29 01:19:59 np0005539510 systemd[1]: Started libpod-conmon-e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8.scope.
Nov 29 01:19:59 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:19:59 np0005539510 podman[81069]: 2025-11-29 06:19:59.343026437 +0000 UTC m=+0.992552134 container init e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:19:59 np0005539510 podman[81069]: 2025-11-29 06:19:59.350186075 +0000 UTC m=+0.999711742 container start e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Nov 29 01:19:59 np0005539510 elegant_dubinsky[81085]: 167 167
Nov 29 01:19:59 np0005539510 systemd[1]: libpod-e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8.scope: Deactivated successfully.
Nov 29 01:19:59 np0005539510 podman[81069]: 2025-11-29 06:19:59.389208017 +0000 UTC m=+1.038733704 container attach e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 29 01:19:59 np0005539510 podman[81069]: 2025-11-29 06:19:59.389641169 +0000 UTC m=+1.039166836 container died e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:19:59 np0005539510 systemd[1]: var-lib-containers-storage-overlay-3f65f410c47ef2da0e7a20b01738f1e234ab3c379e803a03a343a02133afc89f-merged.mount: Deactivated successfully.
Nov 29 01:19:59 np0005539510 podman[81069]: 2025-11-29 06:19:59.661544282 +0000 UTC m=+1.311069949 container remove e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=elegant_dubinsky, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=reef)
Nov 29 01:19:59 np0005539510 systemd[1]: libpod-conmon-e09307d1fccdc9ee4ab3eb55ea9c2702937cb1e6aab997bc2bc04893993ff4c8.scope: Deactivated successfully.
Nov 29 01:19:59 np0005539510 podman[81108]: 2025-11-29 06:19:59.820062335 +0000 UTC m=+0.033421856 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:00 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:01 np0005539510 podman[81108]: 2025-11-29 06:20:01.131421251 +0000 UTC m=+1.344780722 container create 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Nov 29 01:20:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:02 np0005539510 systemd[1]: Started libpod-conmon-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope.
Nov 29 01:20:02 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:20:02 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:02 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:02 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:02 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e36 e36: 3 total, 2 up, 3 in
Nov 29 01:20:02 np0005539510 podman[81108]: 2025-11-29 06:20:02.860531032 +0000 UTC m=+3.073890483 container init 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Nov 29 01:20:02 np0005539510 podman[81108]: 2025-11-29 06:20:02.869642851 +0000 UTC m=+3.083002282 container start 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507)
Nov 29 01:20:02 np0005539510 podman[81108]: 2025-11-29 06:20:02.899278587 +0000 UTC m=+3.112638018 container attach 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: Health detail: HEALTH_ERR 1 filesystem is offline; 1 filesystem is online with fewer MDS than max_mds
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: [ERR] MDS_ALL_DOWN: 1 filesystem is offline
Nov 29 01:20:03 np0005539510 ceph-mon[77142]:    fs cephfs is offline because no MDS is active for it.
Nov 29 01:20:03 np0005539510 ceph-mon[77142]: [WRN] MDS_UP_LESS_THAN_MAX: 1 filesystem is online with fewer MDS than max_mds
Nov 29 01:20:03 np0005539510 ceph-mon[77142]:    fs cephfs has 0 MDS online, but wants 1
Nov 29 01:20:04 np0005539510 stoic_napier[81124]: [
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:    {
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "available": false,
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "ceph_device": false,
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "lsm_data": {},
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "lvs": [],
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "path": "/dev/sr0",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "rejected_reasons": [
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "Insufficient space (<5GB)",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "Has a FileSystem"
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        ],
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        "sys_api": {
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "actuators": null,
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "device_nodes": "sr0",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "devname": "sr0",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "human_readable_size": "482.00 KB",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "id_bus": "ata",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "model": "QEMU DVD-ROM",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "nr_requests": "2",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "parent": "/dev/sr0",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "partitions": {},
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "path": "/dev/sr0",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "removable": "1",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "rev": "2.5+",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "ro": "0",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "rotational": "1",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "sas_address": "",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "sas_device_handle": "",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "scheduler_mode": "mq-deadline",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "sectors": 0,
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "sectorsize": "2048",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "size": 493568.0,
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "support_discard": "2048",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "type": "disk",
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:            "vendor": "QEMU"
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:        }
Nov 29 01:20:04 np0005539510 stoic_napier[81124]:    }
Nov 29 01:20:04 np0005539510 stoic_napier[81124]: ]
Nov 29 01:20:04 np0005539510 systemd[1]: libpod-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope: Deactivated successfully.
Nov 29 01:20:04 np0005539510 podman[81108]: 2025-11-29 06:20:04.087331103 +0000 UTC m=+4.300690534 container died 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 29 01:20:04 np0005539510 systemd[1]: libpod-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope: Consumed 1.214s CPU time.
Nov 29 01:20:04 np0005539510 systemd[1]: var-lib-containers-storage-overlay-1a8cc040208d8bd265fcf207c812a4c015bfceb9831d466d132bf115ad4047e7-merged.mount: Deactivated successfully.
Nov 29 01:20:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e37 e37: 3 total, 2 up, 3 in
Nov 29 01:20:04 np0005539510 podman[81108]: 2025-11-29 06:20:04.201958316 +0000 UTC m=+4.415317747 container remove 19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_napier, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.license=GPLv2)
Nov 29 01:20:04 np0005539510 systemd[1]: libpod-conmon-19349e7d1d1308be3c88f614079b7cdb5b9ef9038912eed05fa3d397b2a757ee.scope: Deactivated successfully.
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Nov 29 01:20:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e38 e38: 3 total, 2 up, 3 in
Nov 29 01:20:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 5.398 iops: 1381.921 elapsed_sec: 2.171
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: log_channel(cluster) log [WRN] : OSD bench result of 1381.921175 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 0 waiting for initial osdmap
Nov 29 01:20:09 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2[79818]: 2025-11-29T06:20:09.161+0000 7fd159203640 -1 osd.2 0 waiting for initial osdmap
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 check_osdmap_features require_osd_release unknown -> reef
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 set_numa_affinity not setting numa affinity
Nov 29 01:20:09 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-osd-2[79818]: 2025-11-29T06:20:09.192+0000 7fd154014640 -1 osd.2 38 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 29 01:20:09 np0005539510 ceph-osd[79822]: osd.2 38 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Nov 29 01:20:10 np0005539510 ceph-osd[79822]: osd.2 38 tick checking mon for new map
Nov 29 01:20:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: Updating compute-0:/etc/ceph/ceph.conf
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: Updating compute-1:/etc/ceph/ceph.conf
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: Updating compute-2:/etc/ceph/ceph.conf
Nov 29 01:20:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e39 e39: 3 total, 2 up, 3 in
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: Updating compute-0:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: Updating compute-1:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: Updating compute-2:/var/lib/ceph/336ec58c-893b-528f-a0c1-6ed1196bc047/config/ceph.conf
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: OSD bench result of 1381.921175 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 40 state: booting -> active
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.1d( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.b( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.5( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.1c( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.f( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.18( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:17 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.12( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.a( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.9( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.2( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.4( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1a( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.18( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1c( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:19 np0005539510 ceph-mon[77142]: osd.2 [v2:192.168.122.102:6800/60987518,v1:192.168.122.102:6801/60987518] boot
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.1b( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.15( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.13( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.12( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.14( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.17( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.10( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.11( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.3( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.8( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.6( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1e( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.7( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.6( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.5( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.d( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1f( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.c( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.a( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.d( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[2.a( empty local-lis/les=0/0 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.b( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.f( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.10( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.15( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.13( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.10( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.16( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.e( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.17( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1d( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1b( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.19( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1b( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1c( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[5.1f( empty local-lis/les=0/0 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 40 pg[3.1d( empty local-lis/les=0/0 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:27 np0005539510 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 01:20:27 np0005539510 systemd[1]: session-19.scope: Consumed 9.258s CPU time.
Nov 29 01:20:28 np0005539510 systemd-logind[784]: Session 19 logged out. Waiting for processes to exit.
Nov 29 01:20:28 np0005539510 systemd-logind[784]: Removed session 19.
Nov 29 01:20:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.10( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.c( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.6( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=14/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.6( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.3( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.13( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.19( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.2( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.4( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.10( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.17( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.14( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.12( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=24/24 les/c/f=25/25/0 sis=40) [2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.10( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=18/18 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[18,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[3.17( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=14/14 les/c/f=15/15/0 sis=40) [2] r=0 lpr=40 pi=[14,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.12( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=40/41 n=0 ec=17/12 lis/c=17/17 les/c/f=19/19/0 sis=40) [2] r=0 lpr=40 pi=[17,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:30 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Nov 29 01:20:30 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Nov 29 01:20:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:31 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:20:31 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:31 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:31 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:20:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Nov 29 01:20:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Nov 29 01:20:34 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.a deep-scrub starts
Nov 29 01:20:34 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.a deep-scrub ok
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Nov 29 01:20:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.14( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.8( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.2( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.19( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.14( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.1d( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.9( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[6.1( empty local-lis/les=0/0 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.3( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.6( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1f( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1d( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.15( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.16( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=0/0 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[7.1f( empty local-lis/les=0/0 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524729729s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942543030s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524394035s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942268372s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524551392s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942417145s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.17( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.525029182s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942920685s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524342537s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942268372s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524610519s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942543030s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.17( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524973869s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942920685s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524315834s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.942390442s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524477005s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942417145s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.10( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.524250984s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.942390442s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.12( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521661758s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939975739s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521576881s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939945221s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521528244s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939945221s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.12( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521575928s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939975739s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521255493s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939750671s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.14( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521340370s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939865112s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521452904s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939983368s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521198273s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939750671s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.14( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521246910s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939865112s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.521417618s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939983368s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519620895s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938369751s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519574165s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938369751s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519407272s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938304901s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519376755s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938304901s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519334793s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938304901s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519217491s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938220978s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.17( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519193649s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938220978s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.519282341s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938304901s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.520560265s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939620972s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518817902s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938079834s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.a( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518780708s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938079834s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.4( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518674850s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938003540s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518661499s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937999725s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518634796s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937999725s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.520272255s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939620972s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.4( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518590927s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938003540s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.2( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518420219s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937969208s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518275261s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937923431s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.2( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518387794s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937969208s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1b( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518247604s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937923431s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518118858s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937870026s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.13( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517918587s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937751770s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518052101s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937870026s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.13( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517898560s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937751770s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517808914s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937728882s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.19( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517912865s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937812805s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517782211s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937728882s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.19( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517834663s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937812805s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517837524s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937881470s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517531395s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937652588s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517782211s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937881470s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517941475s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.938095093s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517477989s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937652588s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517900467s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.938095093s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517189026s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937469482s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.3( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517422676s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937702179s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517161369s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937469482s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517057419s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937427521s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517015457s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937427521s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516923904s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937355042s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516735077s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937217712s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.3( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.517365456s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937702179s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516711235s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937217712s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516699791s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937400818s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516376495s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937091827s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.6( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516853333s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937355042s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.6( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516497612s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937255859s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516351700s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937091827s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516664505s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937400818s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516343117s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937183380s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516312599s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937183380s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.6( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.516367912s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937255859s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515216827s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936321259s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.1f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515192032s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936321259s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515998840s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937152863s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515906334s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.937076569s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.a( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515972137s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937152863s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.d( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515867233s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.937076569s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515160561s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936397552s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514899254s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936237335s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.515081406s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936397552s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.c( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514871597s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936237335s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514591217s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936069489s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514548302s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936069489s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518133163s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.939689636s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.f( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.518104553s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.939689636s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.10( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509164810s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.930801392s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509134293s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.930805206s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514425278s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active pruub 56.936149597s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.b( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.514390945s) [1] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.936149597s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[3.10( empty local-lis/les=40/41 n=0 ec=36/14 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509114265s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.930801392s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:35 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=40/41 n=0 ec=37/18 lis/c=40/40 les/c/f=41/41/0 sis=42 pruub=9.509103775s) [0] r=-1 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 56.930805206s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Nov 29 01:20:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1c( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.1f( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.15( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.16( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.11( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1f( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.5( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1d( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[6.1( empty local-lis/les=42/44 n=0 ec=39/20 lis/c=39/39 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[39,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.3( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.6( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.1d( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.9( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.a( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.14( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.1( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.19( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.2( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[7.14( empty local-lis/les=42/44 n=0 ec=40/21 lis/c=40/40 les/c/f=41/41/0 sis=42) [2] r=0 lpr=42 pi=[40,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 44 pg[4.8( empty local-lis/les=42/44 n=0 ec=36/16 lis/c=36/36 les/c/f=37/37/0 sis=42) [2] r=0 lpr=42 pi=[36,42)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Nov 29 01:20:38 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Nov 29 01:20:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 01:20:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:41.910539765 +0000 UTC m=+0.021892852 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:42.196496928 +0000 UTC m=+0.307849995 container create 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0)
Nov 29 01:20:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.pkypgd", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 01:20:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:42 np0005539510 ceph-mon[77142]: Deploying daemon rgw.rgw.compute-2.pkypgd on compute-2
Nov 29 01:20:42 np0005539510 systemd[1]: Started libpod-conmon-2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f.scope.
Nov 29 01:20:42 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:42.441639148 +0000 UTC m=+0.552992225 container init 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:42.454463222 +0000 UTC m=+0.565816259 container start 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:42.459520404 +0000 UTC m=+0.570873491 container attach 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Nov 29 01:20:42 np0005539510 zen_thompson[83303]: 167 167
Nov 29 01:20:42 np0005539510 systemd[1]: libpod-2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f.scope: Deactivated successfully.
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:42.463231111 +0000 UTC m=+0.574584148 container died 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:20:42 np0005539510 systemd[1]: var-lib-containers-storage-overlay-54fb1e567630240ce82ee3154f21ae2e072a9f814c027988ca3eabb3df1b25f0-merged.mount: Deactivated successfully.
Nov 29 01:20:42 np0005539510 podman[83286]: 2025-11-29 06:20:42.514252841 +0000 UTC m=+0.625605868 container remove 2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=zen_thompson, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:20:42 np0005539510 systemd[1]: libpod-conmon-2187078c803f9cc6b8bdbcbaf01a1f798fc4f79fc03c278b2df884bf1f04e18f.scope: Deactivated successfully.
Nov 29 01:20:42 np0005539510 systemd[1]: Reloading.
Nov 29 01:20:42 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:20:42 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:42 np0005539510 systemd[1]: Reloading.
Nov 29 01:20:42 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:20:42 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:43 np0005539510 systemd[1]: Starting Ceph rgw.rgw.compute-2.pkypgd for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:20:43 np0005539510 podman[83447]: 2025-11-29 06:20:43.331066571 +0000 UTC m=+0.026905702 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:43 np0005539510 podman[83447]: 2025-11-29 06:20:43.625197437 +0000 UTC m=+0.321036508 container create b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Nov 29 01:20:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:43 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0cddee5474a74f27c8bbeb36cf940d6f0cabd276d3eda2b505605f0a3eff515/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.pkypgd supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:43 np0005539510 podman[83447]: 2025-11-29 06:20:43.707028701 +0000 UTC m=+0.402867742 container init b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 29 01:20:43 np0005539510 podman[83447]: 2025-11-29 06:20:43.716332693 +0000 UTC m=+0.412171744 container start b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:20:43 np0005539510 bash[83447]: b699b579e3e50ec8185ae9d6b5dbb2893bbc7d33eee2cb31319a7c5a262da294
Nov 29 01:20:43 np0005539510 systemd[1]: Started Ceph rgw.rgw.compute-2.pkypgd for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:20:43 np0005539510 radosgw[83467]: deferred set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:20:43 np0005539510 radosgw[83467]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Nov 29 01:20:43 np0005539510 radosgw[83467]: framework: beast
Nov 29 01:20:43 np0005539510 radosgw[83467]: framework conf key: endpoint, val: 192.168.122.102:8082
Nov 29 01:20:43 np0005539510 radosgw[83467]: init_numa not setting numa affinity
Nov 29 01:20:44 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Nov 29 01:20:44 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Nov 29 01:20:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Nov 29 01:20:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Nov 29 01:20:46 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 01:20:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Nov 29 01:20:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:47 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 01:20:47 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.cbugbv", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:48 np0005539510 ceph-mon[77142]: Deploying daemon rgw.rgw.compute-1.cbugbv on compute-1
Nov 29 01:20:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Nov 29 01:20:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Nov 29 01:20:49 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 01:20:50 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 01:20:50 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 29 01:20:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e48 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.vmptkp", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 29 01:20:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: Deploying daemon rgw.rgw.compute-0.vmptkp on compute-0
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.102:0/1290272359' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.101:0/1253186838' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:52 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 29 01:20:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Nov 29 01:20:54 np0005539510 ceph-mon[77142]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 01:20:54 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 01:20:54 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 29 01:20:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Nov 29 01:20:55 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.e deep-scrub starts
Nov 29 01:20:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Nov 29 01:20:56 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:56 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.e deep-scrub ok
Nov 29 01:20:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:20:57 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Nov 29 01:20:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:57 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:57 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:57 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:57 np0005539510 podman[83679]: 2025-11-29 06:20:57.969004908 +0000 UTC m=+0.057640653 container create a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:20:58 np0005539510 podman[83679]: 2025-11-29 06:20:57.944300614 +0000 UTC m=+0.032936439 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:58 np0005539510 systemd[1]: Started libpod-conmon-a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565.scope.
Nov 29 01:20:58 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:20:58 np0005539510 podman[83679]: 2025-11-29 06:20:58.133434274 +0000 UTC m=+0.222070069 container init a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:20:58 np0005539510 podman[83679]: 2025-11-29 06:20:58.139939023 +0000 UTC m=+0.228574768 container start a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True)
Nov 29 01:20:58 np0005539510 podman[83679]: 2025-11-29 06:20:58.14366473 +0000 UTC m=+0.232300515 container attach a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:20:58 np0005539510 flamboyant_germain[83696]: 167 167
Nov 29 01:20:58 np0005539510 systemd[1]: libpod-a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565.scope: Deactivated successfully.
Nov 29 01:20:58 np0005539510 podman[83679]: 2025-11-29 06:20:58.145114318 +0000 UTC m=+0.233750073 container died a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:20:58 np0005539510 systemd[1]: var-lib-containers-storage-overlay-9611f9f647e39f6db2f9a13c55c0509de63dff63545797d591e76033d4ff9795-merged.mount: Deactivated successfully.
Nov 29 01:20:58 np0005539510 podman[83679]: 2025-11-29 06:20:58.187893313 +0000 UTC m=+0.276529098 container remove a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_germain, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 29 01:20:58 np0005539510 systemd[1]: libpod-conmon-a9d2204622a86adfa6b6d5dc61450cdeca4e5ce2d24753b471502efc1f283565.scope: Deactivated successfully.
Nov 29 01:20:58 np0005539510 systemd[1]: Reloading.
Nov 29 01:20:58 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:58 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.101:0/111233770' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.gxdwyy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: Deploying daemon mds.cephfs.compute-2.gxdwyy on compute-2
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Nov 29 01:20:58 np0005539510 ceph-mon[77142]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:20:58 np0005539510 systemd[1]: Reloading.
Nov 29 01:20:58 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:20:58 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:58 np0005539510 systemd[1]: Starting Ceph mds.cephfs.compute-2.gxdwyy for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:20:59 np0005539510 podman[83841]: 2025-11-29 06:20:59.018956475 +0000 UTC m=+0.047474218 container create 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:20:59 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:59 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:59 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:59 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/623a962ffe328ef92a00b93b176f643a6f75281f8d58dd21e84de036e3e2c508/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.gxdwyy supports timestamps until 2038 (0x7fffffff)
Nov 29 01:20:59 np0005539510 podman[83841]: 2025-11-29 06:20:59.087589604 +0000 UTC m=+0.116107367 container init 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:20:59 np0005539510 podman[83841]: 2025-11-29 06:20:59.092870652 +0000 UTC m=+0.121388385 container start 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:20:59 np0005539510 podman[83841]: 2025-11-29 06:20:58.99995824 +0000 UTC m=+0.028475993 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:20:59 np0005539510 bash[83841]: 4b521558281d5bbdb1d02047f26ae0e54524f34d720616e22f8053cc3b28d2f9
Nov 29 01:20:59 np0005539510 systemd[1]: Started Ceph mds.cephfs.compute-2.gxdwyy for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:20:59 np0005539510 ceph-mds[83861]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:20:59 np0005539510 ceph-mds[83861]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Nov 29 01:20:59 np0005539510 ceph-mds[83861]: main not setting numa affinity
Nov 29 01:20:59 np0005539510 ceph-mds[83861]: pidfile_write: ignore empty --pid-file
Nov 29 01:20:59 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mds-cephfs-compute-2-gxdwyy[83857]: starting mds.cephfs.compute-2.gxdwyy at 
Nov 29 01:20:59 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 2 from mon.1
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e3 new map
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:19:35.589013+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.gxdwyy{-1:24145} state up:standby seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 3 from mon.1
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Monitors have assigned me to become a standby.
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-1.cbugbv' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.100:0/49466279' entity='client.rgw.rgw.compute-0.vmptkp' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: from='client.? 192.168.122.102:0/2594248517' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e4 new map
Nov 29 01:21:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:00.645745+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:creating seq 1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 4 from mon.1
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.4 handle_mds_map i am now mds.0.4
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x1
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x100
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x600
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x601
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x602
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x603
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x604
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x605
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x606
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x607
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x608
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.cache creating system inode with ino:0x609
Nov 29 01:21:00 np0005539510 ceph-mds[83861]: mds.0.4 creating_done
Nov 29 01:21:00 np0005539510 radosgw[83467]: LDAP not started since no server URIs were provided in the configuration.
Nov 29 01:21:00 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-rgw-rgw-compute-2-pkypgd[83463]: 2025-11-29T06:21:00.876+0000 7fdb615a0940 -1 LDAP not started since no server URIs were provided in the configuration.
Nov 29 01:21:00 np0005539510 radosgw[83467]: framework: beast
Nov 29 01:21:00 np0005539510 radosgw[83467]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Nov 29 01:21:00 np0005539510 radosgw[83467]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Nov 29 01:21:00 np0005539510 radosgw[83467]: starting handler: beast
Nov 29 01:21:00 np0005539510 radosgw[83467]: set uid:gid to 167:167 (ceph:ceph)
Nov 29 01:21:01 np0005539510 radosgw[83467]: mgrc service_daemon_register rgw.24133 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.pkypgd,kernel_description=#1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025,kernel_version=5.14.0-642.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=916ce3c8-b215-47fd-909b-03c5b552b52f,zone_name=default,zonegroup_id=a7fe8251-a74c-4f06-a680-d530d14bb192,zonegroup_name=default}
Nov 29 01:21:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: from='client.? ' entity='client.rgw.rgw.compute-2.pkypgd' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: daemon mds.cephfs.compute-2.gxdwyy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: Cluster is now healthy
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: daemon mds.cephfs.compute-2.gxdwyy is now active in filesystem cephfs as rank 0
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jzycnf", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 01:21:02 np0005539510 ceph-mon[77142]: Deploying daemon mds.cephfs.compute-0.jzycnf on compute-0
Nov 29 01:21:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e5 new map
Nov 29 01:21:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Nov 29 01:21:03 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 5 from mon.1
Nov 29 01:21:03 np0005539510 ceph-mds[83861]: mds.0.4 handle_mds_map i am now mds.0.4
Nov 29 01:21:03 np0005539510 ceph-mds[83861]: mds.0.4 handle_mds_map state change up:creating --> up:active
Nov 29 01:21:03 np0005539510 ceph-mds[83861]: mds.0.4 recovery_done -- successful recovery!
Nov 29 01:21:03 np0005539510 ceph-mds[83861]: mds.0.4 active_start
Nov 29 01:21:05 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Nov 29 01:21:05 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Nov 29 01:21:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e6 new map
Nov 29 01:21:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e7 new map
Nov 29 01:21:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e54 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.vlqnad", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 29 01:21:07 np0005539510 ceph-mon[77142]: Deploying daemon mds.cephfs.compute-1.vlqnad on compute-1
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:08 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Nov 29 01:21:08 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Nov 29 01:21:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Nov 29 01:21:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:12 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Nov 29 01:21:12 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e8 new map
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:01.949294+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 2 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:15 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Nov 29 01:21:15 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: Deploying daemon haproxy.rgw.default.compute-0.zzbnoj on compute-0
Nov 29 01:21:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Nov 29 01:21:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Nov 29 01:21:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:17 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:17 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e9 new map
Nov 29 01:21:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:17.214295+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:17 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy Updating MDS map to version 9 from mon.1
Nov 29 01:21:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Nov 29 01:21:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 29 01:21:20 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Nov 29 01:21:20 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Nov 29 01:21:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e10 new map
Nov 29 01:21:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).mds e10 print_map#012e10#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0119#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-29T06:19:35.588785+0000#012modified#0112025-11-29T06:21:17.214295+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24145}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.gxdwyy{0:24145} state up:active seq 6 join_fscid=1 addr [v2:192.168.122.102:6804/1811763433,v1:192.168.122.102:6805/1811763433] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jzycnf{-1:14409} state up:standby seq 4 join_fscid=1 addr [v2:192.168.122.100:6806/3521074432,v1:192.168.122.100:6807/3521074432] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.vlqnad{-1:24131} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.101:6804/3552238207,v1:192.168.122.101:6805/3552238207] compat {c=[1],r=[1],i=[7ff]}]
Nov 29 01:21:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 01:21:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.d deep-scrub starts
Nov 29 01:21:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.3( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.4( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.6( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.e( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.8( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.16( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.2( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.10( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.16( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.a( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.19( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.a( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.13( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[11.17( empty local-lis/les=0/0 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:24 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 62 pg[8.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:25 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.d deep-scrub ok
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:25 np0005539510 systemd-logind[784]: New session 33 of user zuul.
Nov 29 01:21:25 np0005539510 systemd[1]: Started Session 33 of User zuul.
Nov 29 01:21:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Nov 29 01:21:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000052s ======
Nov 29 01:21:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:26.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000052s
Nov 29 01:21:26 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.b scrub starts
Nov 29 01:21:26 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.b scrub ok
Nov 29 01:21:26 np0005539510 python3.9[84687]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:28.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:28 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:28 np0005539510 python3.9[84958]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.10( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.4( v 54'96 (0'0,54'96] local-lis/les=62/64 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.f( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.1( v 54'96 (0'0,54'96] local-lis/les=62/64 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.12( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.11( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.c( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.3( v 59'99 lc 54'84 (0'0,59'99] local-lis/les=62/64 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=59'99 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[10.1e( v 54'96 (0'0,54'96] local-lis/les=62/64 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=62) [2] r=0 lpr=62 pi=[58,62)/1 crt=54'96 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.1c( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.1f( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.13( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.3( v 46'4 (0'0,46'4] local-lis/les=62/64 n=1 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.11( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.17( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.a( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.19( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.a( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.f( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.5( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.9( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.16( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.15( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.2( v 46'4 (0'0,46'4] local-lis/les=62/64 n=1 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.b( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.16( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.d( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.e( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[8.6( v 46'4 (0'0,46'4] local-lis/les=62/64 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2] r=0 lpr=62 pi=[56,62)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.3( v 54'2 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 64 pg[11.8( v 54'2 lc 0'0 (0'0,54'2] local-lis/les=62/64 n=0 ec=60/51 lis/c=60/60 les/c/f=61/61/0 sis=62) [2] r=0 lpr=62 pi=[60,62)/1 crt=54'2 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:29 np0005539510 ceph-mon[77142]: Deploying daemon haproxy.rgw.default.compute-2.lpqgfx on compute-2
Nov 29 01:21:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:30.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:32 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:21:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:32.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:21:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Nov 29 01:21:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.395221555 +0000 UTC m=+6.492017014 container create 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 01:21:33 np0005539510 systemd[1]: Started libpod-conmon-30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901.scope.
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.378476669 +0000 UTC m=+6.475272198 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 29 01:21:33 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.516796364 +0000 UTC m=+6.613591833 container init 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.533869689 +0000 UTC m=+6.630665168 container start 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.538037088 +0000 UTC m=+6.634832537 container attach 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 01:21:33 np0005539510 goofy_sanderson[85078]: 0 0
Nov 29 01:21:33 np0005539510 systemd[1]: libpod-30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901.scope: Deactivated successfully.
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.544511737 +0000 UTC m=+6.641307216 container died 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 01:21:33 np0005539510 systemd[1]: var-lib-containers-storage-overlay-b2aa4391ada0a5dd4a07574f11fd1e65c1794862a3f8f75f8b6449d4da86dc32-merged.mount: Deactivated successfully.
Nov 29 01:21:33 np0005539510 podman[84737]: 2025-11-29 06:21:33.599004627 +0000 UTC m=+6.695800096 container remove 30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901 (image=quay.io/ceph/haproxy:2.3, name=goofy_sanderson)
Nov 29 01:21:33 np0005539510 systemd[1]: libpod-conmon-30be406d416edb6851a537062e2707b0d10c07bebf90cdc5131873df0e751901.scope: Deactivated successfully.
Nov 29 01:21:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:21:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:34.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:21:34 np0005539510 systemd[1]: Reloading.
Nov 29 01:21:34 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:34 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:21:34 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Nov 29 01:21:34 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Nov 29 01:21:34 np0005539510 systemd[1]: Reloading.
Nov 29 01:21:34 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:21:34 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:34 np0005539510 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.lpqgfx for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:21:35 np0005539510 podman[85231]: 2025-11-29 06:21:35.202461671 +0000 UTC m=+0.024800177 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 29 01:21:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Nov 29 01:21:35 np0005539510 podman[85231]: 2025-11-29 06:21:35.405236807 +0000 UTC m=+0.227575293 container create e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:21:35 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e077d830398853e06c0755f719428ef7611a0c98d04678e5362d34dcac8c5a5/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Nov 29 01:21:35 np0005539510 podman[85231]: 2025-11-29 06:21:35.496653779 +0000 UTC m=+0.318992325 container init e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:21:35 np0005539510 podman[85231]: 2025-11-29 06:21:35.502186934 +0000 UTC m=+0.324525440 container start e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:21:35 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx[85246]: [NOTICE] 332/062135 (2) : New worker #1 (4) forked
Nov 29 01:21:35 np0005539510 bash[85231]: e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629
Nov 29 01:21:35 np0005539510 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.lpqgfx for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:21:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:36.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:36.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:38.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:38.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:39 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Nov 29 01:21:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Nov 29 01:21:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Nov 29 01:21:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:21:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:40.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.7( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.1b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 66 pg[9.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=66) [2] r=0 lpr=66 pi=[58,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:41 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Nov 29 01:21:41 np0005539510 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 01:21:41 np0005539510 systemd[1]: session-33.scope: Consumed 9.673s CPU time.
Nov 29 01:21:41 np0005539510 systemd-logind[784]: Session 33 logged out. Waiting for processes to exit.
Nov 29 01:21:41 np0005539510 systemd-logind[784]: Removed session 33.
Nov 29 01:21:42 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Nov 29 01:21:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:42.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:42.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:21:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:44.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:21:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:44.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Nov 29 01:21:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:46.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 01:21:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 01:21:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 01:21:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 29 01:21:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:46.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:47 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.e scrub starts
Nov 29 01:21:47 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.e scrub ok
Nov 29 01:21:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:21:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:48.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.1b( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.f( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.7( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:48 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 68 pg[9.7( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=68) [2]/[1] r=-1 lpr=68 pi=[58,68)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:21:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:48.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:49 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Nov 29 01:21:49 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Nov 29 01:21:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:50.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:50 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Nov 29 01:21:50 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Nov 29 01:21:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:50.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Nov 29 01:21:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Nov 29 01:21:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:52.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:21:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:52.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:21:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:54.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:54.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:55 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:21:55 np0005539510 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 01:21:55 np0005539510 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 01:21:55 np0005539510 ceph-mon[77142]: Deploying daemon keepalived.rgw.default.compute-2.klqjoa on compute-2
Nov 29 01:21:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Nov 29 01:21:55 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 70 pg[9.17( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=70) [2] r=0 lpr=70 pi=[58,70)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:55 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 70 pg[9.17( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=70) [2] r=0 lpr=70 pi=[58,70)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:56.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Nov 29 01:21:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:56.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.7( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.13( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.7( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.3( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.13( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.b( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.3( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:21:56 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 71 pg[9.17( v 56'1130 (0'0,56'1130] local-lis/les=70/71 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=70) [2] r=0 lpr=70 pi=[58,70)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:57 np0005539510 systemd-logind[784]: New session 34 of user zuul.
Nov 29 01:21:57 np0005539510 systemd[1]: Started Session 34 of User zuul.
Nov 29 01:21:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:21:58.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:58 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Nov 29 01:21:58 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Nov 29 01:21:58 np0005539510 python3.9[85652]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 01:21:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:21:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:21:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:21:58.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:21:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:21:59 np0005539510 python3.9[85841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.7( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.13( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.3( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:21:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 72 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=68/58 les/c/f=69/59/0 sis=71) [2] r=0 lpr=71 pi=[58,71)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:00.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:00 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Nov 29 01:22:00 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Nov 29 01:22:00 np0005539510 podman[85434]: 2025-11-29 06:22:00.825529697 +0000 UTC m=+10.288389575 container create b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2)
Nov 29 01:22:00 np0005539510 systemd[1]: Started libpod-conmon-b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf.scope.
Nov 29 01:22:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:00.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:00 np0005539510 podman[85434]: 2025-11-29 06:22:00.806166755 +0000 UTC m=+10.269026633 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 29 01:22:00 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:22:01 np0005539510 python3.9[86016]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:22:01 np0005539510 podman[85434]: 2025-11-29 06:22:01.675647722 +0000 UTC m=+11.138507620 container init b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, com.redhat.component=keepalived-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, name=keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, release=1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Nov 29 01:22:01 np0005539510 podman[85434]: 2025-11-29 06:22:01.685302037 +0000 UTC m=+11.148161925 container start b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=keepalived, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, release=1793, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, version=2.2.4)
Nov 29 01:22:01 np0005539510 clever_pike[85981]: 0 0
Nov 29 01:22:01 np0005539510 systemd[1]: libpod-b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf.scope: Deactivated successfully.
Nov 29 01:22:02 np0005539510 podman[85434]: 2025-11-29 06:22:02.040149735 +0000 UTC m=+11.503009613 container attach b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, com.redhat.component=keepalived-container, release=1793, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, version=2.2.4, distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived)
Nov 29 01:22:02 np0005539510 podman[85434]: 2025-11-29 06:22:02.041710283 +0000 UTC m=+11.504570171 container died b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., release=1793, io.openshift.tags=Ceph keepalived, version=2.2.4, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph.)
Nov 29 01:22:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:02.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:02 np0005539510 python3.9[86183]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:22:02 np0005539510 systemd[1]: var-lib-containers-storage-overlay-336a23a73ed125d91fb69388f54c6e326bced95e89e171120698e6c148a33348-merged.mount: Deactivated successfully.
Nov 29 01:22:02 np0005539510 podman[85434]: 2025-11-29 06:22:02.860731501 +0000 UTC m=+12.323591419 container remove b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf (image=quay.io/ceph/keepalived:2.2.4, name=clever_pike, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, name=keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., version=2.2.4, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:22:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:02.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:02 np0005539510 systemd[1]: libpod-conmon-b00fcfe213095648eff678e58cd24e0d40ae9083f5483819748603a550e4c1bf.scope: Deactivated successfully.
Nov 29 01:22:03 np0005539510 systemd[1]: Reloading.
Nov 29 01:22:03 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:22:03 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:22:03 np0005539510 systemd[1]: Reloading.
Nov 29 01:22:03 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:22:03 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:22:03 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Nov 29 01:22:03 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Nov 29 01:22:03 np0005539510 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.klqjoa for 336ec58c-893b-528f-a0c1-6ed1196bc047...
Nov 29 01:22:03 np0005539510 python3.9[86419]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:22:03 np0005539510 podman[86465]: 2025-11-29 06:22:03.733275192 +0000 UTC m=+0.021292719 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 29 01:22:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:04.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:04.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:05 np0005539510 python3.9[86629]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:22:05 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1d deep-scrub starts
Nov 29 01:22:05 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1d deep-scrub ok
Nov 29 01:22:05 np0005539510 podman[86465]: 2025-11-29 06:22:05.634933754 +0000 UTC m=+1.922951261 container create d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.component=keepalived-container, release=1793, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.28.2, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4)
Nov 29 01:22:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e72 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:06 np0005539510 python3.9[86780]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:22:06 np0005539510 network[86797]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:22:06 np0005539510 network[86798]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:22:06 np0005539510 network[86799]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:22:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:06.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:06 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/225a3e51dd02ee070057392a07ad23b72c8c2d48a7d4402bcf59dce250c46a6c/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:22:06 np0005539510 podman[86465]: 2025-11-29 06:22:06.256619708 +0000 UTC m=+2.544637245 container init d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=Ceph keepalived, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, io.openshift.expose-services=, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, release=1793, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public)
Nov 29 01:22:06 np0005539510 podman[86465]: 2025-11-29 06:22:06.262544793 +0000 UTC m=+2.550562310 container start d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, distribution-scope=public, io.buildah.version=1.28.2, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, vendor=Red Hat, Inc., release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Nov 29 01:22:06 np0005539510 bash[86465]: d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Starting Keepalived v2.2.4 (08/21,2021)
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Running on Linux 5.14.0-642.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025 (built for Linux 5.14.0)
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Configuration file /etc/keepalived/keepalived.conf
Nov 29 01:22:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Starting VRRP child process, pid=4
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: Startup complete
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: (VI_0) Entering BACKUP STATE (init)
Nov 29 01:22:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.5( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 73 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=73) [2] r=0 lpr=73 pi=[58,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:06 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:06 2025: VRRP_Script(check_backend) succeeded
Nov 29 01:22:06 np0005539510 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.klqjoa for 336ec58c-893b-528f-a0c1-6ed1196bc047.
Nov 29 01:22:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:06.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 01:22:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 29 01:22:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 01:22:07 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.5( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.5( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:07 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 74 pg[9.15( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=74) [2]/[1] r=-1 lpr=74 pi=[58,74)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:08 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 29 01:22:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:08.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:08.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: Deploying daemon keepalived.rgw.default.compute-0.uyqrbs on compute-0
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Nov 29 01:22:09 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Nov 29 01:22:09 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:09 2025: (VI_0) Entering MASTER STATE
Nov 29 01:22:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:10.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:10 np0005539510 python3.9[87072]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:22:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Nov 29 01:22:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:10.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:11 np0005539510 python3.9[87223]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:22:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Nov 29 01:22:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:12.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Nov 29 01:22:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:12.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:13 np0005539510 python3.9[87378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.5( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.5( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:13 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 77 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Nov 29 01:22:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:14.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:14 np0005539510 python3.9[87536]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:22:14 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Nov 29 01:22:14 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Nov 29 01:22:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:14.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:15 np0005539510 python3.9[87621]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:22:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e77 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:16.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.a scrub starts
Nov 29 01:22:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.a scrub ok
Nov 29 01:22:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:16.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Nov 29 01:22:17 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Nov 29 01:22:17 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Nov 29 01:22:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:18.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Nov 29 01:22:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Nov 29 01:22:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 79 pg[9.5( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=74/58 les/c/f=75/59/0 sis=77) [2] r=0 lpr=77 pi=[58,77)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:18 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Nov 29 01:22:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:18.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:19 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Nov 29 01:22:19 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Nov 29 01:22:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:20.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:20 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:20 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Nov 29 01:22:20 np0005539510 ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa[86807]: Sat Nov 29 06:22:20 2025: (VI_0) Entering BACKUP STATE
Nov 29 01:22:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:20.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:21 np0005539510 podman[87957]: 2025-11-29 06:22:21.341349103 +0000 UTC m=+1.871605174 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:22:21 np0005539510 podman[87957]: 2025-11-29 06:22:21.442188227 +0000 UTC m=+1.972444268 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 29 01:22:22 np0005539510 podman[88109]: 2025-11-29 06:22:22.136202761 +0000 UTC m=+0.101623064 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:22:22 np0005539510 podman[88109]: 2025-11-29 06:22:22.146192214 +0000 UTC m=+0.111612497 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:22:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:22.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:22 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Nov 29 01:22:22 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Nov 29 01:22:22 np0005539510 podman[88171]: 2025-11-29 06:22:22.370789302 +0000 UTC m=+0.051910125 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived)
Nov 29 01:22:22 np0005539510 podman[88171]: 2025-11-29 06:22:22.40934683 +0000 UTC m=+0.090467643 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, name=keepalived, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793)
Nov 29 01:22:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:22.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Nov 29 01:22:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Nov 29 01:22:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:24.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:24 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Nov 29 01:22:24 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Nov 29 01:22:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:24.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Nov 29 01:22:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Nov 29 01:22:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:26.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 80 pg[9.18( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=80) [2] r=0 lpr=80 pi=[58,80)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 80 pg[9.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=80) [2] r=0 lpr=80 pi=[58,80)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:22:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Nov 29 01:22:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:22:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Nov 29 01:22:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:26.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:27 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Nov 29 01:22:27 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 81 pg[9.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=81) [2] r=0 lpr=81 pi=[58,81)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:27 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 81 pg[9.9( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=81) [2] r=0 lpr=81 pi=[58,81)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:27 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Nov 29 01:22:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.9( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.9( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.18( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.18( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:28 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 82 pg[9.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=82) [2]/[1] r=-1 lpr=82 pi=[58,82)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Nov 29 01:22:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Nov 29 01:22:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Nov 29 01:22:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:30.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Nov 29 01:22:30 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.9( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:30 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.9( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:30 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:30 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 84 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:30.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:31 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Nov 29 01:22:31 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Nov 29 01:22:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:32.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Nov 29 01:22:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Nov 29 01:22:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:32.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:33 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Nov 29 01:22:33 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Nov 29 01:22:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:34.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:34.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:36.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:36 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:22:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:36.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:37 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.18( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.18( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.8( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.8( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.9( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:37 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 85 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=84) [2] r=0 lpr=84 pi=[58,84)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:38 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Nov 29 01:22:38 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Nov 29 01:22:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:38.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:22:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:38.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:22:39 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Nov 29 01:22:39 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Nov 29 01:22:39 np0005539510 systemd[72593]: Created slice User Background Tasks Slice.
Nov 29 01:22:39 np0005539510 systemd[72593]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 01:22:39 np0005539510 systemd[72593]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 01:22:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:22:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:40.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:22:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:40.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Nov 29 01:22:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 86 pg[9.18( v 56'1130 (0'0,56'1130] local-lis/les=85/86 n=5 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 86 pg[9.8( v 56'1130 (0'0,56'1130] local-lis/les=85/86 n=6 ec=58/47 lis/c=82/58 les/c/f=83/59/0 sis=85) [2] r=0 lpr=85 pi=[58,85)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:22:41 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.f deep-scrub starts
Nov 29 01:22:41 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.f deep-scrub ok
Nov 29 01:22:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:42.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:42.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:44.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:44 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Nov 29 01:22:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:44.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:46 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Nov 29 01:22:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:46.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:46 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Nov 29 01:22:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:22:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:46.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:47 np0005539510 ceph-mon[77142]: Reconfiguring mon.compute-0 (monmap changed)...
Nov 29 01:22:47 np0005539510 ceph-mon[77142]: Reconfiguring daemon mon.compute-0 on compute-0
Nov 29 01:22:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.vxabpq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 29 01:22:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:48.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:22:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:48.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:22:49 np0005539510 ceph-mon[77142]: Reconfiguring mgr.compute-0.vxabpq (monmap changed)...
Nov 29 01:22:49 np0005539510 ceph-mon[77142]: Reconfiguring daemon mgr.compute-0.vxabpq on compute-0
Nov 29 01:22:49 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Nov 29 01:22:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Nov 29 01:22:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:50.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:50 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Nov 29 01:22:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:50.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:51 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Nov 29 01:22:51 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Nov 29 01:22:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Nov 29 01:22:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:52 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Nov 29 01:22:52 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:52 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:22:52 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 01:22:52 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.b scrub starts
Nov 29 01:22:52 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.b scrub ok
Nov 29 01:22:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:52.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Nov 29 01:22:52 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367929459s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 active pruub 197.759460449s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:52 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367857933s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 197.759460449s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:52 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367554665s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 active pruub 197.759429932s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:52 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 90 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=90 pruub=13.367480278s) [0] r=-1 lpr=90 pi=[77,90)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 197.759429932s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:22:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:52.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:53 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.16 scrub starts
Nov 29 01:22:53 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.16 scrub ok
Nov 29 01:22:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:54.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:54.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:55 np0005539510 ceph-mon[77142]: Reconfiguring crash.compute-0 (monmap changed)...
Nov 29 01:22:55 np0005539510 ceph-mon[77142]: Reconfiguring daemon crash.compute-0 on compute-0
Nov 29 01:22:55 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Nov 29 01:22:55 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 29 01:22:55 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 29 01:22:55 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Nov 29 01:22:55 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Nov 29 01:22:56 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Nov 29 01:22:56 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Nov 29 01:22:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:56.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:22:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:56.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:22:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:22:57 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Nov 29 01:22:57 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:57 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:57 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:22:57 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 91 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:22:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 01:22:58 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Nov 29 01:22:58 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Nov 29 01:22:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:22:58.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:22:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:22:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:22:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:22:58.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:00.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:00.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Nov 29 01:23:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:02 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.15 deep-scrub starts
Nov 29 01:23:02 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.15 deep-scrub ok
Nov 29 01:23:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:02.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:02 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 92 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=5 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] async=[0] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:02 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 92 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=6 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=91) [0]/[2] async=[0] r=0 lpr=91 pi=[77,91)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:02.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:03 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Nov 29 01:23:03 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Nov 29 01:23:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:04.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: Reconfiguring osd.1 (monmap changed)...
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Nov 29 01:23:04 np0005539510 ceph-mon[77142]: Reconfiguring daemon osd.1 on compute-0
Nov 29 01:23:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:04.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:06.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Nov 29 01:23:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.806956291s) [0] async=[0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 56'1130 active pruub 210.056961060s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=5 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.806859970s) [0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 210.056961060s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.809009552s) [0] async=[0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 56'1130 active pruub 210.059432983s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:06 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 93 pg[9.d( v 56'1130 (0'0,56'1130] local-lis/les=91/92 n=6 ec=58/47 lis/c=91/77 les/c/f=92/79/0 sis=93 pruub=11.808897972s) [0] r=-1 lpr=93 pi=[77,93)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 210.059432983s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:06.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:07 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Nov 29 01:23:07 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Nov 29 01:23:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 01:23:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 29 01:23:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:08.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:08.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:09 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Nov 29 01:23:09 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Nov 29 01:23:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:10.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:12.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Nov 29 01:23:12 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=14.999697685s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 active pruub 218.826431274s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:12 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=14.999450684s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 218.826431274s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:12 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=15.610712051s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 active pruub 219.438018799s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:12 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 94 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=94 pruub=15.610674858s) [0] r=-1 lpr=94 pi=[71,94)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 219.438018799s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:12.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:13 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.9 deep-scrub starts
Nov 29 01:23:13 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.9 deep-scrub ok
Nov 29 01:23:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:14.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:14.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:15 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.a scrub starts
Nov 29 01:23:15 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.a scrub ok
Nov 29 01:23:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.a scrub starts
Nov 29 01:23:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.a scrub ok
Nov 29 01:23:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:16.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Nov 29 01:23:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:16.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:17 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 29 01:23:17 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:17 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:18 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.f scrub starts
Nov 29 01:23:18 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.f scrub ok
Nov 29 01:23:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:18.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Nov 29 01:23:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:18 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 96 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:23:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:18.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:19 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.d scrub starts
Nov 29 01:23:19 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.d scrub ok
Nov 29 01:23:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 29 01:23:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 29 01:23:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:20.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:20.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:22 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Nov 29 01:23:22 np0005539510 ceph-mon[77142]: Reconfiguring crash.compute-1 (monmap changed)...
Nov 29 01:23:22 np0005539510 ceph-mon[77142]: Reconfiguring daemon crash.compute-1 on compute-1
Nov 29 01:23:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:22.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:22 np0005539510 python3.9[88799]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:23:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:22.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Nov 29 01:23:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Nov 29 01:23:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:23 np0005539510 ceph-mon[77142]: Reconfiguring osd.0 (monmap changed)...
Nov 29 01:23:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Nov 29 01:23:23 np0005539510 ceph-mon[77142]: Reconfiguring daemon osd.0 on compute-1
Nov 29 01:23:23 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 97 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=6 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] async=[0] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:23 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 97 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=5 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=96) [0]/[2] async=[0] r=0 lpr=96 pi=[71,96)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:23:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.e scrub starts
Nov 29 01:23:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.e scrub ok
Nov 29 01:23:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:24.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:24 np0005539510 python3.9[89088]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 01:23:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:24.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:25 np0005539510 ceph-mon[77142]: Reconfiguring mon.compute-1 (monmap changed)...
Nov 29 01:23:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:23:25 np0005539510 ceph-mon[77142]: Reconfiguring daemon mon.compute-1 on compute-1
Nov 29 01:23:26 np0005539510 python3.9[89240]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 01:23:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:26.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Nov 29 01:23:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.176123619s) [0] async=[0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 56'1130 active pruub 231.373046875s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.171274185s) [0] async=[0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 56'1130 active pruub 231.368865967s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:23:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=6 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.171154976s) [0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 231.368865967s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 98 pg[9.1f( v 56'1130 (0'0,56'1130] local-lis/les=96/97 n=5 ec=58/47 lis/c=96/71 les/c/f=97/72/0 sis=98 pruub=13.175400734s) [0] r=-1 lpr=98 pi=[71,98)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 231.373046875s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:23:26 np0005539510 python3.9[89393]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:23:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:26.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:27 np0005539510 python3.9[89545]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 01:23:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:28.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:28.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:29 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Nov 29 01:23:29 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Nov 29 01:23:29 np0005539510 python3.9[89699]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Nov 29 01:23:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:30.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:30 np0005539510 python3.9[89852]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:23:30 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:30.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:31 np0005539510 python3.9[89931]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:23:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:32.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:32 np0005539510 python3.9[90083]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:23:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:32.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:33 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Nov 29 01:23:33 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Nov 29 01:23:33 np0005539510 podman[90227]: 2025-11-29 06:23:33.277203305 +0000 UTC m=+0.063323011 container create 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:23:33 np0005539510 systemd[1]: Started libpod-conmon-29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8.scope.
Nov 29 01:23:33 np0005539510 podman[90227]: 2025-11-29 06:23:33.25173897 +0000 UTC m=+0.037858666 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:23:33 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:23:33 np0005539510 podman[90227]: 2025-11-29 06:23:33.396717645 +0000 UTC m=+0.182837361 container init 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 01:23:33 np0005539510 podman[90227]: 2025-11-29 06:23:33.403537446 +0000 UTC m=+0.189657122 container start 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:23:33 np0005539510 podman[90227]: 2025-11-29 06:23:33.408480527 +0000 UTC m=+0.194600253 container attach 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Nov 29 01:23:33 np0005539510 nostalgic_feynman[90268]: 167 167
Nov 29 01:23:33 np0005539510 systemd[1]: libpod-29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8.scope: Deactivated successfully.
Nov 29 01:23:33 np0005539510 podman[90301]: 2025-11-29 06:23:33.46134193 +0000 UTC m=+0.031004544 container died 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:23:33 np0005539510 systemd[1]: var-lib-containers-storage-overlay-e27a7cd7a6c72ac3bbe825aa7e204f1c284b773448ac3dc4a6199262f0f0a76a-merged.mount: Deactivated successfully.
Nov 29 01:23:33 np0005539510 podman[90301]: 2025-11-29 06:23:33.507222197 +0000 UTC m=+0.076884811 container remove 29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nostalgic_feynman, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Nov 29 01:23:33 np0005539510 systemd[1]: libpod-conmon-29d71691ef94bfeda07eb1f27d9963ad20fccdaa0e89ae9811abe22a483235b8.scope: Deactivated successfully.
Nov 29 01:23:33 np0005539510 python3.9[90393]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 01:23:34 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Nov 29 01:23:34 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Nov 29 01:23:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:34.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:34 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:34.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:35 np0005539510 python3.9[90547]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 01:23:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:36.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:36.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:37 np0005539510 python3.9[90701]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:23:37 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Nov 29 01:23:37 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:37 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 01:23:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:38.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: Reconfiguring mon.compute-2 (monmap changed)...
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: Reconfiguring daemon mon.compute-2 on compute-2
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Nov 29 01:23:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Nov 29 01:23:38 np0005539510 python3.9[90935]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 01:23:38 np0005539510 podman[91049]: 2025-11-29 06:23:38.854343274 +0000 UTC m=+0.056193301 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:23:38 np0005539510 podman[91049]: 2025-11-29 06:23:38.962325969 +0000 UTC m=+0.164175966 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Nov 29 01:23:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:40 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Nov 29 01:23:40 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Nov 29 01:23:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:40.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:41 np0005539510 python3.9[91390]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:23:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:41 np0005539510 podman[91279]: 2025-11-29 06:23:41.507974835 +0000 UTC m=+1.905463955 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:23:41 np0005539510 podman[91279]: 2025-11-29 06:23:41.515000311 +0000 UTC m=+1.912489401 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:23:41 np0005539510 podman[91443]: 2025-11-29 06:23:41.717063541 +0000 UTC m=+0.058830561 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Nov 29 01:23:41 np0005539510 podman[91463]: 2025-11-29 06:23:41.819987861 +0000 UTC m=+0.083049784 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, build-date=2023-02-22T09:23:20)
Nov 29 01:23:42 np0005539510 podman[91443]: 2025-11-29 06:23:42.049168601 +0000 UTC m=+0.390935611 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, release=1793, io.buildah.version=1.28.2, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Nov 29 01:23:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 01:23:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 29 01:23:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Nov 29 01:23:42 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Nov 29 01:23:42 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Nov 29 01:23:42 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Nov 29 01:23:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:42 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Nov 29 01:23:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:42.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Nov 29 01:23:43 np0005539510 python3.9[91759]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:44 np0005539510 python3.9[91912]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:23:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:44.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:45 np0005539510 python3.9[91990]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:46.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:46 np0005539510 python3.9[92142]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:23:46 np0005539510 python3.9[92221]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:23:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:46.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Nov 29 01:23:48 np0005539510 python3.9[92373]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:23:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:48.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:23:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:23:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:23:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:49.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Nov 29 01:23:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:51.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:51 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Nov 29 01:23:51 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Nov 29 01:23:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:52.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:52 np0005539510 python3.9[92527]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:23:52 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Nov 29 01:23:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:53.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:53 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Nov 29 01:23:53 np0005539510 python3.9[92679]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 01:23:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:54.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:23:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:23:55 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.c scrub starts
Nov 29 01:23:55 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 8.c scrub ok
Nov 29 01:23:55 np0005539510 python3.9[92830]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:23:56 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Nov 29 01:23:56 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Nov 29 01:23:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:56.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:57.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:57 np0005539510 python3.9[92983]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:23:57 np0005539510 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 01:23:57 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:23:57 np0005539510 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 01:23:57 np0005539510 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 01:23:57 np0005539510 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:23:57 np0005539510 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:23:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:23:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:23:58.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:23:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:23:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:23:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:23:59.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:23:59 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.17 deep-scrub starts
Nov 29 01:23:59 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.17 deep-scrub ok
Nov 29 01:23:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Nov 29 01:24:00 np0005539510 python3.9[93195]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 01:24:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:00.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:01.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:01 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1b scrub starts
Nov 29 01:24:01 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1b scrub ok
Nov 29 01:24:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:02.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:03.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:03 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.7 deep-scrub starts
Nov 29 01:24:03 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.7 deep-scrub ok
Nov 29 01:24:04 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.b scrub starts
Nov 29 01:24:04 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.b scrub ok
Nov 29 01:24:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:04.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:04 np0005539510 python3.9[93350]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:24:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:05.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Nov 29 01:24:05 np0005539510 python3.9[93504]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:24:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:05 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 29 01:24:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:06.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:07.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:07 np0005539510 systemd-logind[784]: Session 34 logged out. Waiting for processes to exit.
Nov 29 01:24:07 np0005539510 systemd[1]: session-34.scope: Deactivated successfully.
Nov 29 01:24:07 np0005539510 systemd[1]: session-34.scope: Consumed 1min 7.390s CPU time.
Nov 29 01:24:07 np0005539510 systemd-logind[784]: Removed session 34.
Nov 29 01:24:07 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:08.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:09.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:10.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Nov 29 01:24:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:24:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 01:24:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:11.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:11 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Nov 29 01:24:11 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Nov 29 01:24:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:12.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:13.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 29 01:24:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 01:24:14 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Nov 29 01:24:14 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Nov 29 01:24:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:14.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:15.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Nov 29 01:24:16 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Nov 29 01:24:16 np0005539510 systemd-logind[784]: New session 35 of user zuul.
Nov 29 01:24:16 np0005539510 systemd[1]: Started Session 35 of User zuul.
Nov 29 01:24:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:16.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:17 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Nov 29 01:24:17 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Nov 29 01:24:17 np0005539510 python3.9[93740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:18 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Nov 29 01:24:18 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Nov 29 01:24:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:18.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:19.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:19 np0005539510 python3.9[93897]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 01:24:19 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.19 scrub starts
Nov 29 01:24:19 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.19 scrub ok
Nov 29 01:24:19 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Nov 29 01:24:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 01:24:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 110 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=110 pruub=14.469883919s) [0] r=-1 lpr=110 pi=[77,110)/1 crt=56'1130 mlcod 0'0 active pruub 285.761932373s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 110 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=110 pruub=14.469782829s) [0] r=-1 lpr=110 pi=[77,110)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 285.761932373s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:20 np0005539510 python3.9[94100]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:24:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:20.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Nov 29 01:24:21 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 111 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=0 lpr=111 pi=[77,111)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:21 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 111 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=77/79 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] r=0 lpr=111 pi=[77,111)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:21.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:21 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 111 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=111) [2] r=0 lpr=111 pi=[78,111)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:21 np0005539510 python3.9[94187]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:24:22 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Nov 29 01:24:22 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Nov 29 01:24:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:22.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:23.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:23 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:24 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Nov 29 01:24:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:24.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Nov 29 01:24:25 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=-1 lpr=112 pi=[78,112)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:25 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=78/78 les/c/f=79/79/0 sis=112) [2]/[0] r=-1 lpr=112 pi=[78,112)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:25.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:25 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 112 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=111/112 n=4 ec=58/47 lis/c=77/77 les/c/f=79/79/0 sis=111) [0]/[2] async=[0] r=0 lpr=111 pi=[77,111)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:26 np0005539510 python3.9[94342]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:26.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:27.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:28.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:29.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 29 01:24:29 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 01:24:30 np0005539510 python3.9[94497]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:24:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:30.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Nov 29 01:24:30 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=111/112 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113 pruub=10.494091034s) [0] async=[0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 56'1130 active pruub 292.558746338s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:30 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 113 pg[9.15( v 56'1130 (0'0,56'1130] local-lis/les=111/112 n=4 ec=58/47 lis/c=111/77 les/c/f=112/79/0 sis=113 pruub=10.493929863s) [0] r=-1 lpr=113 pi=[77,113)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 292.558746338s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:32.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:32 np0005539510 python3.9[94652]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:33.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:34 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:34.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:34 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Nov 29 01:24:34 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114) [2] r=0 lpr=114 pi=[78,114)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:34 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 114 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114) [2] r=0 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:34 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 29 01:24:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:35.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:35 np0005539510 python3.9[94805]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 01:24:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Nov 29 01:24:36 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 115 pg[9.16( v 56'1130 (0'0,56'1130] local-lis/les=114/115 n=5 ec=58/47 lis/c=112/78 les/c/f=113/79/0 sis=114) [2] r=0 lpr=114 pi=[78,114)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:36.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:36 np0005539510 python3.9[94955]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:37.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:37 np0005539510 python3.9[95114]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:38.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:39.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:39 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:39 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Nov 29 01:24:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Nov 29 01:24:40 np0005539510 python3.9[95318]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:24:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:40.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Nov 29 01:24:40 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 117 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=117 pruub=12.623802185s) [1] r=-1 lpr=117 pi=[84,117)/1 crt=56'1130 mlcod 0'0 active pruub 304.739318848s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:40 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 117 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=117 pruub=12.623720169s) [1] r=-1 lpr=117 pi=[84,117)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 304.739318848s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:40 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Nov 29 01:24:40 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Nov 29 01:24:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:41.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Nov 29 01:24:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Nov 29 01:24:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 118 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=118) [1]/[2] r=0 lpr=118 pi=[84,118)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:41 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 118 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=84/85 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=118) [1]/[2] r=0 lpr=118 pi=[84,118)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:42 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Nov 29 01:24:42 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Nov 29 01:24:42 np0005539510 python3.9[95606]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:24:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:42.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:42 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Nov 29 01:24:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:43.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:43 np0005539510 python3.9[95757]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:24:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Nov 29 01:24:43 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 119 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=118/119 n=7 ec=58/47 lis/c=84/84 les/c/f=85/85/0 sis=118) [1]/[2] async=[1] r=0 lpr=118 pi=[84,118)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:44 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Nov 29 01:24:44 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:44 np0005539510 python3.9[95911]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:44 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Nov 29 01:24:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:45.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 01:24:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Nov 29 01:24:46 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=121 pruub=9.605250359s) [1] r=-1 lpr=121 pi=[71,121)/1 crt=56'1130 mlcod 0'0 active pruub 307.441101074s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:46 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=121 pruub=9.605175018s) [1] r=-1 lpr=121 pi=[71,121)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 307.441101074s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:46 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=118/119 n=7 ec=58/47 lis/c=118/84 les/c/f=119/85/0 sis=121 pruub=13.593140602s) [1] async=[1] r=-1 lpr=121 pi=[84,121)/1 crt=56'1130 mlcod 56'1130 active pruub 311.429443359s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:46 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 121 pg[9.19( v 56'1130 (0'0,56'1130] local-lis/les=118/119 n=7 ec=58/47 lis/c=118/84 les/c/f=119/85/0 sis=121 pruub=13.593073845s) [1] r=-1 lpr=121 pi=[84,121)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 311.429443359s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:47.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:47 np0005539510 python3.9[96067]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 29 01:24:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 01:24:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:48.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:49.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:50.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:51.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Nov 29 01:24:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 122 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=122) [1]/[2] r=0 lpr=122 pi=[71,122)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 122 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=71/72 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=122) [1]/[2] r=0 lpr=122 pi=[71,122)/1 crt=56'1130 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 29 01:24:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:52.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:52 np0005539510 python3.9[96223]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:24:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:53.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:53 np0005539510 python3.9[96377]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Nov 29 01:24:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:54.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Nov 29 01:24:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 29 01:24:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:24:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:24:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:55.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:24:55 np0005539510 systemd[1]: session-35.scope: Deactivated successfully.
Nov 29 01:24:55 np0005539510 systemd[1]: session-35.scope: Consumed 19.758s CPU time.
Nov 29 01:24:55 np0005539510 systemd-logind[784]: Session 35 logged out. Waiting for processes to exit.
Nov 29 01:24:55 np0005539510 systemd-logind[784]: Removed session 35.
Nov 29 01:24:55 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 123 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=122/123 n=2 ec=58/47 lis/c=71/71 les/c/f=72/72/0 sis=122) [1]/[2] async=[1] r=0 lpr=122 pi=[71,122)/1 crt=56'1130 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:24:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:56.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:57.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:24:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:24:58.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:24:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:24:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:24:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:24:59.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:24:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Nov 29 01:24:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 124 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=122/123 n=2 ec=58/47 lis/c=122/71 les/c/f=123/72/0 sis=124 pruub=12.276672363s) [1] async=[1] r=-1 lpr=124 pi=[71,124)/1 crt=56'1130 mlcod 56'1130 active pruub 323.284393311s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:24:59 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 124 pg[9.1b( v 56'1130 (0'0,56'1130] local-lis/les=122/123 n=2 ec=58/47 lis/c=122/71 les/c/f=123/72/0 sis=124 pruub=12.276548386s) [1] r=-1 lpr=124 pi=[71,124)/1 crt=56'1130 mlcod 0'0 unknown NOTIFY pruub 323.284393311s@ mbc={}] state<Start>: transitioning to Stray
Nov 29 01:24:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:00.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:01.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Nov 29 01:25:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:02.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:02 np0005539510 systemd-logind[784]: New session 36 of user zuul.
Nov 29 01:25:02 np0005539510 systemd[1]: Started Session 36 of User zuul.
Nov 29 01:25:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:03.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:03 np0005539510 python3.9[96610]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:04.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:05.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:05 np0005539510 python3.9[96765]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:25:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Nov 29 01:25:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 01:25:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 01:25:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 29 01:25:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:06.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:06 np0005539510 python3.9[96959]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:25:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:07.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:07 np0005539510 systemd-logind[784]: Session 36 logged out. Waiting for processes to exit.
Nov 29 01:25:07 np0005539510 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 01:25:07 np0005539510 systemd[1]: session-36.scope: Consumed 2.256s CPU time.
Nov 29 01:25:07 np0005539510 systemd-logind[784]: Removed session 36.
Nov 29 01:25:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:08.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:09.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:10.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:11.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 29 01:25:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:12 np0005539510 podman[97158]: 2025-11-29 06:25:12.158887875 +0000 UTC m=+3.279462817 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Nov 29 01:25:12 np0005539510 podman[97158]: 2025-11-29 06:25:12.257000656 +0000 UTC m=+3.377575498 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:25:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:12.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:13.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:13 np0005539510 podman[97316]: 2025-11-29 06:25:13.812095489 +0000 UTC m=+0.859998325 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:25:13 np0005539510 podman[97316]: 2025-11-29 06:25:13.884346586 +0000 UTC m=+0.932249372 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:25:14 np0005539510 podman[97380]: 2025-11-29 06:25:14.261041485 +0000 UTC m=+0.139702441 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1793, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9)
Nov 29 01:25:14 np0005539510 podman[97400]: 2025-11-29 06:25:14.355048218 +0000 UTC m=+0.072977269 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, release=1793, vendor=Red Hat, Inc., vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, architecture=x86_64, io.openshift.tags=Ceph keepalived, name=keepalived, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2)
Nov 29 01:25:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:14 np0005539510 podman[97380]: 2025-11-29 06:25:14.593301911 +0000 UTC m=+0.471962837 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, vendor=Red Hat, Inc., name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9)
Nov 29 01:25:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Nov 29 01:25:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:15.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:15 np0005539510 systemd-logind[784]: New session 37 of user zuul.
Nov 29 01:25:15 np0005539510 systemd[1]: Started Session 37 of User zuul.
Nov 29 01:25:15 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 127 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=127) [2] r=0 lpr=127 pi=[93,127)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:25:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Nov 29 01:25:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 29 01:25:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:16 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:16 np0005539510 python3.9[97567]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:16.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:17.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:17 np0005539510 python3.9[97722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:18.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:18 np0005539510 python3.9[97878]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 01:25:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:19.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:19 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:25:19 np0005539510 python3.9[97963]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:20.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:21.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 29 01:25:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:22.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:22 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Nov 29 01:25:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:23.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:23 np0005539510 python3.9[98300]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:25:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:25:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 29 01:25:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:25:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Nov 29 01:25:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:24.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:25.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Nov 29 01:25:25 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:25:25 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:25:25 np0005539510 python3.9[98496]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:26 np0005539510 python3.9[98648]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:25:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:26.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Nov 29 01:25:26 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:25:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:27.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:27 np0005539510 python3.9[98814]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:27 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:27 np0005539510 python3.9[98892]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:27 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Nov 29 01:25:28 np0005539510 python3.9[99044]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:28.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:28 np0005539510 python3.9[99123]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:29.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:30 np0005539510 python3.9[99275]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:30.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:30 np0005539510 python3.9[99428]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:31.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:31 np0005539510 python3.9[99580]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:31 np0005539510 python3.9[99732]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:25:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Nov 29 01:25:32 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Nov 29 01:25:32 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:32.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:33.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:33 np0005539510 python3.9[99885]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:34.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:35.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:36.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:37 np0005539510 python3.9[100040]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:25:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:37.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:37 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:37 np0005539510 python3.9[100194]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:25:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:38.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:38 np0005539510 python3.9[100347]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:25:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:39.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:39 np0005539510 python3.9[100499]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:25:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Nov 29 01:25:40 np0005539510 python3.9[100703]: ansible-service_facts Invoked
Nov 29 01:25:40 np0005539510 network[100720]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:25:40 np0005539510 network[100721]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:25:40 np0005539510 network[100722]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:25:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:41.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:42 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:42.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:43.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:45.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Nov 29 01:25:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 29 01:25:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:47.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:48 np0005539510 python3.9[101177]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:48.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:49.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Nov 29 01:25:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 29 01:25:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:52.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:52 np0005539510 python3.9[101332]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 01:25:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:53.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:53 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:25:54 np0005539510 python3.9[101485]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:25:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:54.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:25:54 np0005539510 python3.9[101597]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:55.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:55 np0005539510 python3.9[101766]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:56 np0005539510 python3.9[101844]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:25:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:25:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:25:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:57.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:25:57 np0005539510 python3.9[101997]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:25:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:25:58.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:25:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:25:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:25:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:25:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Nov 29 01:25:59 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:26:00 np0005539510 python3.9[102150]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000055s ======
Nov 29 01:26:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Nov 29 01:26:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:01.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:01 np0005539510 python3.9[102285]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:26:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:02.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:02 np0005539510 systemd[1]: session-37.scope: Deactivated successfully.
Nov 29 01:26:02 np0005539510 systemd[1]: session-37.scope: Consumed 23.554s CPU time.
Nov 29 01:26:02 np0005539510 systemd-logind[784]: Session 37 logged out. Waiting for processes to exit.
Nov 29 01:26:02 np0005539510 systemd-logind[784]: Removed session 37.
Nov 29 01:26:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:03.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:04.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Nov 29 01:26:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:05.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:07.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:08.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:08 np0005539510 systemd-logind[784]: New session 38 of user zuul.
Nov 29 01:26:08 np0005539510 systemd[1]: Started Session 38 of User zuul.
Nov 29 01:26:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:09.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:09 np0005539510 python3.9[102471]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:10 np0005539510 python3.9[102624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:10.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:10 np0005539510 python3.9[102702]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:11.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:11 np0005539510 systemd-logind[784]: Session 38 logged out. Waiting for processes to exit.
Nov 29 01:26:11 np0005539510 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 01:26:11 np0005539510 systemd[1]: session-38.scope: Consumed 1.434s CPU time.
Nov 29 01:26:11 np0005539510 systemd-logind[784]: Removed session 38.
Nov 29 01:26:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:12.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:13.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:14.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.662095) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574662170, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7900, "num_deletes": 256, "total_data_size": 16721993, "memory_usage": 16933328, "flush_reason": "Manual Compaction"}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574730915, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10246102, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 257, "largest_seqno": 7905, "table_properties": {"data_size": 10212502, "index_size": 22413, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10117, "raw_key_size": 96628, "raw_average_key_size": 24, "raw_value_size": 10133446, "raw_average_value_size": 2519, "num_data_blocks": 982, "num_entries": 4022, "num_filter_entries": 4022, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 1764397158, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 69148 microseconds, and 20658 cpu microseconds.
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.731246) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10246102 bytes OK
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.731336) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.732553) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.732570) EVENT_LOG_v1 {"time_micros": 1764397574732564, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.732592) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 16679049, prev total WAL file size 16679684, number of live WAL files 2.
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.736486) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10005KB) 8(1648B)]
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574736648, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10247750, "oldest_snapshot_seqno": -1}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3769 keys, 10242318 bytes, temperature: kUnknown
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574808595, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10242318, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10209411, "index_size": 22365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 92412, "raw_average_key_size": 24, "raw_value_size": 10133504, "raw_average_value_size": 2688, "num_data_blocks": 982, "num_entries": 3769, "num_filter_entries": 3769, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764397574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.808946) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10242318 bytes
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.810573) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.3 rd, 142.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(9.8, 0.0 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 4027, records dropped: 258 output_compression: NoCompression
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.810604) EVENT_LOG_v1 {"time_micros": 1764397574810591, "job": 4, "event": "compaction_finished", "compaction_time_micros": 72031, "compaction_time_cpu_micros": 23599, "output_level": 6, "num_output_files": 1, "total_output_size": 10242318, "num_input_records": 4027, "num_output_records": 3769, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574813113, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397574813183, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 29 01:26:14 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:26:14.736219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:26:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:15.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:17.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:17 np0005539510 systemd-logind[784]: New session 39 of user zuul.
Nov 29 01:26:17 np0005539510 systemd[1]: Started Session 39 of User zuul.
Nov 29 01:26:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:18.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:18 np0005539510 python3.9[102886]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:19.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:20.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:20 np0005539510 python3.9[103043]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:21.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:21 np0005539510 python3.9[103268]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:22 np0005539510 python3.9[103346]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.spqdks13 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:22.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:23.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:23 np0005539510 python3.9[103499]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:24 np0005539510 python3.9[103577]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=._4aepc4w recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:24.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:25 np0005539510 python3.9[103730]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:25.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:26 np0005539510 python3.9[103882]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:26.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:26 np0005539510 python3.9[103961]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:27.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:27 np0005539510 python3.9[104113]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:27 np0005539510 python3.9[104193]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:28.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:28 np0005539510 python3.9[104346]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:29.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:29 np0005539510 python3.9[104498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:29 np0005539510 python3.9[104576]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:30.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:30 np0005539510 python3.9[104729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:31.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:31 np0005539510 python3.9[104807]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:32 np0005539510 python3.9[104959]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:26:32 np0005539510 systemd[1]: Reloading.
Nov 29 01:26:32 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:26:32 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:26:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:32.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:33.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:33 np0005539510 python3.9[105149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:34 np0005539510 python3.9[105227]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:35.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:35 np0005539510 python3.9[105380]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:35 np0005539510 python3.9[105458]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:36 np0005539510 python3.9[105610]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:26:36 np0005539510 systemd[1]: Reloading.
Nov 29 01:26:36 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:26:36 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:26:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:36.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:36 np0005539510 systemd[1]: Starting Create netns directory...
Nov 29 01:26:36 np0005539510 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:26:36 np0005539510 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:26:36 np0005539510 systemd[1]: Finished Create netns directory.
Nov 29 01:26:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:37.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:37 np0005539510 python3.9[105802]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:26:37 np0005539510 network[105819]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:26:37 np0005539510 network[105820]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:26:37 np0005539510 network[105821]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:26:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:38.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:39.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:40.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:41.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:42.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:43.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:44.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:45.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:46.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:47.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:48 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:26:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:48.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:49 np0005539510 python3.9[106137]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:49.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:49 np0005539510 python3.9[106218]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:26:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:50.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:26:50 np0005539510 python3.9[106371]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:51.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:51 np0005539510 python3.9[106523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:52 np0005539510 python3.9[106601]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:52 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 01:26:52 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(15) init, last seen epoch 15, mid-election, bumping
Nov 29 01:26:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:26:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:26:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:52.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:26:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:53.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:53 np0005539510 python3.9[106754]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 01:26:53 np0005539510 systemd[1]: Starting Time & Date Service...
Nov 29 01:26:53 np0005539510 systemd[1]: Started Time & Date Service.
Nov 29 01:26:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:54 np0005539510 python3.9[106910]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:26:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:54.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:26:54 np0005539510 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 01:26:54 np0005539510 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 01:26:54 np0005539510 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 01:26:54 np0005539510 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:26:54 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:26:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:55.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:55 np0005539510 python3.9[107200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:55 np0005539510 podman[107233]: 2025-11-29 06:26:55.275547334 +0000 UTC m=+0.087465020 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 01:26:55 np0005539510 podman[107233]: 2025-11-29 06:26:55.393258009 +0000 UTC m=+0.205175675 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 01:26:55 np0005539510 python3.9[107359]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:56 np0005539510 podman[107489]: 2025-11-29 06:26:56.010305044 +0000 UTC m=+0.086074661 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:26:56 np0005539510 podman[107536]: 2025-11-29 06:26:56.087996403 +0000 UTC m=+0.062074748 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:26:56 np0005539510 podman[107489]: 2025-11-29 06:26:56.16742006 +0000 UTC m=+0.243189647 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:26:56 np0005539510 python3.9[107661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:56 np0005539510 podman[107683]: 2025-11-29 06:26:56.556191832 +0000 UTC m=+0.216098438 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, distribution-scope=public, io.buildah.version=1.28.2, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4)
Nov 29 01:26:56 np0005539510 podman[107683]: 2025-11-29 06:26:56.566941579 +0000 UTC m=+0.226848175 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, version=2.2.4, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, distribution-scope=public, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, release=1793, vendor=Red Hat, Inc.)
Nov 29 01:26:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:56.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:56 np0005539510 python3.9[107792]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.uon86pot recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:57.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:57 np0005539510 python3.9[107944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:58 np0005539510 python3.9[108022]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:26:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:26:58.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:26:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:26:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:26:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 01:26:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:26:59.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 01:26:59 np0005539510 python3.9[108175]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:00 np0005539510 python3[108460]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:27:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000039s ======
Nov 29 01:27:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:00.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000039s
Nov 29 01:27:01 np0005539510 python3.9[108613]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:01.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:01 np0005539510 python3.9[108741]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:02 np0005539510 python3.9[108893]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:02.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:02 np0005539510 python3.9[108972]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 01:27:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:03.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 01:27:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:03 np0005539510 python3.9[109124]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:04 np0005539510 python3.9[109202]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:04.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:05 np0005539510 python3.9[109355]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:05.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:05 np0005539510 python3.9[109433]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:06.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:06 np0005539510 python3.9[109586]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:07.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:07 np0005539510 python3.9[109664]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000039s ======
Nov 29 01:27:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:08.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000039s
Nov 29 01:27:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 01:27:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:09.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 01:27:09 np0005539510 python3.9[109817]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:10.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:10 np0005539510 python3.9[109973]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:11.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:11 np0005539510 python3.9[110125]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:12 np0005539510 python3.9[110277]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:12.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:13.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:13 np0005539510 python3.9[110430]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:27:13 np0005539510 python3.9[110582]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:27:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:27:14 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 01:27:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:14.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 01:27:14 np0005539510 systemd[1]: session-39.scope: Deactivated successfully.
Nov 29 01:27:14 np0005539510 systemd[1]: session-39.scope: Consumed 30.571s CPU time.
Nov 29 01:27:14 np0005539510 systemd-logind[784]: Session 39 logged out. Waiting for processes to exit.
Nov 29 01:27:14 np0005539510 systemd-logind[784]: Removed session 39.
Nov 29 01:27:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000038s ======
Nov 29 01:27:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:15.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000038s
Nov 29 01:27:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:16.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:17.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:18.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:18 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:27:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:19.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:20.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:21 np0005539510 systemd-logind[784]: New session 40 of user zuul.
Nov 29 01:27:21 np0005539510 systemd[1]: Started Session 40 of User zuul.
Nov 29 01:27:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:21.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:21 np0005539510 python3.9[110767]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 01:27:22 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:27:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:22.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:22 np0005539510 python3.9[111020]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:27:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:23.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:23 np0005539510 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 01:27:23 np0005539510 python3.9[111174]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 29 01:27:24 np0005539510 python3.9[111328]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.bamr9k_4 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:24.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:25 np0005539510 python3.9[111454]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.bamr9k_4 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397643.9400768-110-76035653769870/.source.bamr9k_4 _original_basename=.qaqv7d3h follow=False checksum=b291f010aefff8b88f41011b780271a83fd1182f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:25.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:26 np0005539510 python3.9[111606]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:26.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:27.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:27 np0005539510 python3.9[111759]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2GXKCQiCwQEMihcSwDVeJtG2CpTemmA6MTbtOkxbB3OAV5PK8v8imPvDGMDurfGFQG0RzWyv9szlMJXdgIkwejIfy/AY7p6nemHOpu6DdAx0EA/jg1YcOIeeEhyMw1/oFzjYClGMohaI1oTKHtR29UXWphTAroOkf26Exvco6hh2ApRTXV9ObzSoOyCC7+OZcOWgYzdoCfu/0FDGkH2ksKLQS7d4AAh/XZ/njXhK57U7ptxHCReUPECGRv7KB4f8TelZDAIeUyp7ngd/9ivUDO1zue1Qr9ECzTzAFqippGXFmYl3+oSid03CY7bqnxav4xWt7UukbaO57goyIPfkklPdC1kA7kZqa9bqeDU1WgDkqnLu8hluArB0Y0Jz+hDfx9pTbAL6MklraoLaGrnrgcibAollAN+7WGqdWxUotENYaljO7P1Z18MlNllWFzk4Le5jMLNL8qArSlzM+ufOThnLdGEuYZhH1x969AisGQ4MQWn0P0lZFu6fE5VSNA/k=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdPWx5WoFJTxz6PiFZL5f3XrtE682RjGFiIpoe0LXZO#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQlZMweHfLYiJFtm1r2tQze/oNx6KzgaXkK+Kof7POk0cFMLbTsXU8qgbQMh4o5LVO0Hbas4mAqxRkGcFCg2Po=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCX0dhB1m0xL0qEi5jnTQLLB4bvueVV5foNrqU/OkfV/4gRyp7uP2q21lWq5Dtl2GLk51pS6oD41RI41Y5g7OSRs8b1Z66d6X1QgX0Qns6pv7FwmNSQ25+2VGV6lppnaN5e+JHiwTmzpf82hl/MiiJrHo7B63mllKyl9SZJxUhP9RR4czS3QNYQsZyP7sZeCWothTZ2Q/GK4BWBEtj2+ifeOpa342IivopCH05YVQOx9bpsdFHMYaalMDCwvr2lfVns8aTcpJ3z9uE8wLdKWTyiinT7nuLX6RuPwhXB2proBRH1wrGSIUgcVcizkWn8QizD8LlsGFcHIQJkmq+sJz6r7cCZLIfS6hdAzI+hYbJie6n/agwfxe4r+mbXsmmC6ALKKk7CEnaiNnDg0fgTaUfBPwSfu+JmVrjdSO+S8f/CMbtYeO6QknOxhLV9oK6knszv7nLlSYXTzXanHkN4Y0fW3dsSvoE+qDR0YijbbT8slqMd6z95wWVDFUmTcN8Nzk8=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILci1PI4hoB56+xxS5gSMKceuJ/dv6t7etpmtENwoSFr#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJIaOLr2ntjSUcigXC7a0sFoonsuh0ChCx2a1R6G8EDmJ8/ZB8NEiJE6KAQJDNU5XsXjuaC44eJhOUMRK9r98xA=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUVpPatup3d17omeiTdJaYR8jCcDbraJSPBxWy49Wxst4G+6/lD41HVIKmjgCgIbbmYSFBPQmoXt4gFXP4FRKna6AbQWi0kwF3/T2biQ2qCid0HVDSS8YRVlyrpdVc1/bIg6YNLkGnhzOMp0S1443+cg5PqutAbrAT1LOg6lSBu+K9gIqJ4un3l2guSweoyba5UhMyjrq4Pffx1QCuBggtYSjmA9Q1r5VVNc2J7AbP0QuzOe6J6DhpdGJsfmHDVXZb/4b/aPUdCTKkLseyUtcqElWVhhnGnpYSJdN81ejalSktGHE4JRHih19wwTokiKvoczUgijBzOfl+kt2ELcpDgzpzY0M9yd0Zz7wrK4rLM6hi8x3LYZXZv8N7KnawUcJ2jfzilx1BVLdNzgwDNB7ZlP4O9Vs3fKnBufCUFPNcRyWl6ooczepbgxqgSbr/Ham2O4/qzvJmzLtu0KxBkaFALRWnyM39nYVE/jrMKJ5ihtVDxIY9FGma/Jifg15gqI0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIN19pK3a7AH/OiwlqJTVWP/qzU/QzkC16s4D1xY1Vn6J#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLsXsjJNPVMX1YVTe2oBmcZpUSiv3HOeuICgZtQun4hTopMXH9dE1jQeUruGwqZ+NsKW6X2bLZZJ0/tcn2owL8Q=#012 create=True mode=0644 path=/tmp/ansible.bamr9k_4 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:28 np0005539510 python3.9[111911]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bamr9k_4' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:28.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:29.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:29 np0005539510 python3.9[112066]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bamr9k_4 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:30 np0005539510 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 01:27:30 np0005539510 systemd[1]: session-40.scope: Consumed 4.821s CPU time.
Nov 29 01:27:30 np0005539510 systemd-logind[784]: Session 40 logged out. Waiting for processes to exit.
Nov 29 01:27:30 np0005539510 systemd-logind[784]: Removed session 40.
Nov 29 01:27:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:30.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:31.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:32.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:33.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:34 np0005539510 systemd-logind[784]: New session 41 of user zuul.
Nov 29 01:27:34 np0005539510 systemd[1]: Started Session 41 of User zuul.
Nov 29 01:27:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:35.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:36 np0005539510 python3.9[112247]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:36.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:37.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:37 np0005539510 python3.9[112404]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:27:38 np0005539510 python3.9[112558]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:27:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:38.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:39.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:40 np0005539510 python3.9[112712]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:40.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:41.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:41 np0005539510 python3.9[112866]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:27:42 np0005539510 python3.9[113066]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:42.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:42 np0005539510 systemd[1]: session-41.scope: Deactivated successfully.
Nov 29 01:27:42 np0005539510 systemd[1]: session-41.scope: Consumed 3.983s CPU time.
Nov 29 01:27:42 np0005539510 systemd-logind[784]: Session 41 logged out. Waiting for processes to exit.
Nov 29 01:27:42 np0005539510 systemd-logind[784]: Removed session 41.
Nov 29 01:27:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:43.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:44.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:45.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:46.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:47.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:48 np0005539510 systemd-logind[784]: New session 42 of user zuul.
Nov 29 01:27:48 np0005539510 systemd[1]: Started Session 42 of User zuul.
Nov 29 01:27:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:48.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:49 np0005539510 python3.9[113250]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:49.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:50 np0005539510 python3.9[113406]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:27:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:50.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:51.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:27:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:52.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:27:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:53.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:54.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:55.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:55 np0005539510 python3.9[113491]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:27:56 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:27:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:56.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:27:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:27:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:27:58.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:27:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:27:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:27:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:27:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:27:59.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:00 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:28:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:00.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:01.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:02 np0005539510 python3.9[113650]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:02.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).paxos(paxos updating c 252..820) lease_timeout -- calling new election
Nov 29 01:28:03 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 01:28:03 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(18) init, last seen epoch 18
Nov 29 01:28:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:03.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:28:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:28:03 np0005539510 python3.9[113852]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:28:04 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy MDS connection to Monitors appears to be laggy; 15.7604s since last acked beacon
Nov 29 01:28:04 np0005539510 ceph-mds[83861]: mds.0.4 skipping upkeep work because connection to Monitors appears laggy
Nov 29 01:28:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:28:04 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy  MDS is no longer laggy
Nov 29 01:28:04 np0005539510 python3.9[114003]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:04.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:28:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:05.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:28:05 np0005539510 python3.9[114153]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:06 np0005539510 systemd[1]: session-42.scope: Deactivated successfully.
Nov 29 01:28:06 np0005539510 systemd[1]: session-42.scope: Consumed 6.193s CPU time.
Nov 29 01:28:06 np0005539510 systemd-logind[784]: Session 42 logged out. Waiting for processes to exit.
Nov 29 01:28:06 np0005539510 systemd-logind[784]: Removed session 42.
Nov 29 01:28:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:06.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:07.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:08 np0005539510 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 01:28:08 np0005539510 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 01:28:08 np0005539510 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:28:08 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:28:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:08.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:09.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:09 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:10.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:11.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:11 np0005539510 systemd-logind[784]: New session 43 of user zuul.
Nov 29 01:28:11 np0005539510 systemd[1]: Started Session 43 of User zuul.
Nov 29 01:28:12 np0005539510 python3.9[114335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:12.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:13.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:14 np0005539510 python3.9[114492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:14.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:15 np0005539510 python3.9[114644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:15.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:16 np0005539510 python3.9[114796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:16.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:16 np0005539510 python3.9[114920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397695.4294388-160-227687559416352/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=14b9bbfa9929911e1123ed6fe048b8e915417748 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:17.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:17 np0005539510 python3.9[115072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:18 np0005539510 python3.9[115195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397696.9737144-160-64058702470122/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=03c2952c2692ca442730881904078ac3e566f340 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:18 np0005539510 python3.9[115348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:18.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:19 np0005539510 python3.9[115471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397698.1790023-160-13165336933269/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=be9a231ca8cb9d5c8a85bd82f4d8528bcb487e51 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:19 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:19.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:19 np0005539510 python3.9[115623]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:20 np0005539510 python3.9[115775]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:20.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:21.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:21 np0005539510 python3.9[115928]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:21 np0005539510 python3.9[116051]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397700.6391497-322-189162828707543/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d83e9ef310607793aac5272a5dd3ed54e63fe338 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:22 np0005539510 python3.9[116203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:22.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:23 np0005539510 python3.9[116500]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397702.0420837-322-120017904376708/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:23 np0005539510 podman[116550]: 2025-11-29 06:28:23.097658266 +0000 UTC m=+0.059937363 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:28:23 np0005539510 podman[116550]: 2025-11-29 06:28:23.198479232 +0000 UTC m=+0.160758319 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:28:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:23.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:23 np0005539510 python3.9[116782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:24 np0005539510 podman[116871]: 2025-11-29 06:28:24.154354169 +0000 UTC m=+0.429044514 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:28:24 np0005539510 podman[116871]: 2025-11-29 06:28:24.167275031 +0000 UTC m=+0.441965376 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:28:24 np0005539510 python3.9[116984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397703.154555-322-33864050510225/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=5d5a903db1eca232a57ca76aa1a372ced69c51b8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:24 np0005539510 podman[117037]: 2025-11-29 06:28:24.538998362 +0000 UTC m=+0.201944369 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, vendor=Red Hat, Inc., name=keepalived, release=1793, io.openshift.tags=Ceph keepalived, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, version=2.2.4, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.expose-services=)
Nov 29 01:28:24 np0005539510 podman[117106]: 2025-11-29 06:28:24.635775738 +0000 UTC m=+0.074162731 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, description=keepalived for Ceph, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, name=keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-type=git)
Nov 29 01:28:24 np0005539510 podman[117037]: 2025-11-29 06:28:24.692555174 +0000 UTC m=+0.355501181 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, version=2.2.4, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:28:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:24.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:25 np0005539510 python3.9[117223]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:25.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:25 np0005539510 python3.9[117475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:26 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:28:26 np0005539510 python3.9[117659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:26.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:27 np0005539510 python3.9[117782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397706.120503-475-106873351317592/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=4ccf2634f20abca04ee2090faa470941e7667ac5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:27.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:27 np0005539510 python3.9[117934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:28 np0005539510 python3.9[118057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397707.288686-475-84753795137553/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=446989bd92736b57ebc923ce429d8effafd00e68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:28.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:28 np0005539510 python3.9[118210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:29.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:29 np0005539510 python3.9[118333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397708.4817088-475-251589301346846/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=e97a0024800c75a2251bda4519fe7a3e8494189f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:30.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:30 np0005539510 python3.9[118486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:31.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:31 np0005539510 python3.9[118638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:32 np0005539510 python3.9[118761]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397711.0869498-645-50314244674820/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:32 np0005539510 python3.9[118914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:33.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:33 np0005539510 python3.9[119066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:34 np0005539510 python3.9[119189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397713.1532283-729-236043614670945/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:34 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:35 np0005539510 python3.9[119342]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:35.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:35 np0005539510 python3.9[119494]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:36 np0005539510 python3.9[119617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397715.219514-801-7912416078450/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:36.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:36 np0005539510 python3.9[119770]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:37.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:37 np0005539510 python3.9[119922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:38 np0005539510 python3.9[120045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397717.0958455-868-180624767154835/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:38.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:39 np0005539510 python3.9[120198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:39 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:39.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:39 np0005539510 python3.9[120350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:40 np0005539510 python3.9[120475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397719.410827-938-74257077186553/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:41 np0005539510 python3.9[120628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:41.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:28:41 np0005539510 python3.9[120780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:42 np0005539510 python3.9[120953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397721.3963447-1010-175397569019269/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3385b01217fece5877d0a0cc7f45f60761b1d6d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:28:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:42.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:28:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:43.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:44 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:44.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.005000136s ======
Nov 29 01:28:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:45.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000136s
Nov 29 01:28:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:46.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:47 np0005539510 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 01:28:47 np0005539510 systemd[1]: session-43.scope: Consumed 22.510s CPU time.
Nov 29 01:28:47 np0005539510 systemd-logind[784]: Session 43 logged out. Waiting for processes to exit.
Nov 29 01:28:47 np0005539510 systemd-logind[784]: Removed session 43.
Nov 29 01:28:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:47.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:28:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:48.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:28:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:49.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:50.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:51.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:52 np0005539510 systemd-logind[784]: New session 44 of user zuul.
Nov 29 01:28:52 np0005539510 systemd[1]: Started Session 44 of User zuul.
Nov 29 01:28:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:52.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:53.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:53 np0005539510 python3.9[121189]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:54 np0005539510 python3.9[121341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:54.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:55 np0005539510 python3.9[121465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397733.781479-69-25501589853031/.source.conf _original_basename=ceph.conf follow=False checksum=b678e866ce48244e104f356f74865d3398155ff0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:55.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:55 np0005539510 python3.9[121617]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:56 np0005539510 python3.9[121740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397735.3108172-69-208818788610847/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=d5bc1b1c0617b147c8e3e13846b179249a244079 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.748428) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736748481, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1351, "num_deletes": 253, "total_data_size": 3239363, "memory_usage": 3272648, "flush_reason": "Manual Compaction"}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736761353, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1349524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7910, "largest_seqno": 9256, "table_properties": {"data_size": 1344764, "index_size": 2156, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12194, "raw_average_key_size": 20, "raw_value_size": 1334447, "raw_average_value_size": 2246, "num_data_blocks": 99, "num_entries": 594, "num_filter_entries": 594, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397574, "oldest_key_time": 1764397574, "file_creation_time": 1764397736, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 12980 microseconds, and 4308 cpu microseconds.
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.761406) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1349524 bytes OK
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.761433) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.764326) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.764356) EVENT_LOG_v1 {"time_micros": 1764397736764347, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.764382) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 3232965, prev total WAL file size 3232965, number of live WAL files 2.
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.765227) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323534' seq:0, type:0; will stop at (end)
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1317KB)], [15(10002KB)]
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736765267, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 11591842, "oldest_snapshot_seqno": -1}
Nov 29 01:28:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:56.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3885 keys, 9449033 bytes, temperature: kUnknown
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736827915, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9449033, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9417566, "index_size": 20669, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9733, "raw_key_size": 95386, "raw_average_key_size": 24, "raw_value_size": 9341653, "raw_average_value_size": 2404, "num_data_blocks": 911, "num_entries": 3885, "num_filter_entries": 3885, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764397736, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.828215) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9449033 bytes
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.829578) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 184.8 rd, 150.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 9.8 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(15.6) write-amplify(7.0) OK, records in: 4363, records dropped: 478 output_compression: NoCompression
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.829599) EVENT_LOG_v1 {"time_micros": 1764397736829588, "job": 6, "event": "compaction_finished", "compaction_time_micros": 62741, "compaction_time_cpu_micros": 25673, "output_level": 6, "num_output_files": 1, "total_output_size": 9449033, "num_input_records": 4363, "num_output_records": 3885, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736829925, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397736831844, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.765156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831887) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:56 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:28:56.831889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:28:57 np0005539510 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 01:28:57 np0005539510 systemd[1]: session-44.scope: Consumed 2.584s CPU time.
Nov 29 01:28:57 np0005539510 systemd-logind[784]: Session 44 logged out. Waiting for processes to exit.
Nov 29 01:28:57 np0005539510 systemd-logind[784]: Removed session 44.
Nov 29 01:28:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:57.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:28:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:28:58.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:28:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:28:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:28:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:28:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:28:59.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:29:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:00.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:01.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:02.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:02 np0005539510 systemd-logind[784]: New session 45 of user zuul.
Nov 29 01:29:02 np0005539510 systemd[1]: Started Session 45 of User zuul.
Nov 29 01:29:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:03.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:03 np0005539510 python3.9[121972]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:04.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:05.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:05 np0005539510 python3.9[122129]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539510 python3.9[122281]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:06.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:07 np0005539510 python3.9[122432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:07.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:08 np0005539510 python3.9[122584]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:29:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:08.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:09.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:09 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:10.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:11.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:12.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:13.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:13 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 01:29:14 np0005539510 python3.9[122743]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:29:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:14.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:15 np0005539510 python3.9[122828]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:29:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:15.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:16.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:17.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:18 np0005539510 python3.9[122982]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:29:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:29:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 1366 writes, 9395 keys, 1366 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 1366 writes, 1366 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1366 writes, 9395 keys, 1366 commit groups, 1.0 writes per commit group, ingest: 19.37 MB, 0.03 MB/s#012Interval WAL: 1366 writes, 1366 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    131.8      0.08              0.02         3    0.028       0      0       0.0       0.0#012  L6      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.7    154.5    139.3      0.13              0.05         2    0.067    8390    736       0.0       0.0#012 Sum      1/0    9.01 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     95.2    136.5      0.22              0.07         5    0.044    8390    736       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     96.0    137.6      0.22              0.07         4    0.054    8390    736       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    154.5    139.3      0.13              0.05         2    0.067    8390    736       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    134.6      0.08              0.02         2    0.041       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.011, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.2 seconds#012Interval compaction: 0.03 GB write, 0.05 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.2 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 304.00 MB usage: 767.56 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 6.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(32,660.91 KB,0.212308%) FilterBlock(5,31.98 KB,0.0102746%) IndexBlock(5,74.67 KB,0.0239874%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 29 01:29:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:18.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:19.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:19 np0005539510 python3[123138]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 01:29:19 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:20.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:21.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:21 np0005539510 python3.9[123291]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:22 np0005539510 python3.9[123443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:22 np0005539510 python3.9[123522]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:22.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:23.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:23 np0005539510 python3.9[123724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:23 np0005539510 python3.9[123802]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x3ahtaus recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:24.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:25 np0005539510 python3.9[123955]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:25.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:25 np0005539510 python3.9[124033]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:26 np0005539510 python3.9[124185]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:26.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:27.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:28 np0005539510 python3[124339]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:29:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:28.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:28 np0005539510 python3.9[124492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:29.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:29 np0005539510 python3.9[124617]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397768.4435725-438-159037064962300/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:30.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:31 np0005539510 python3.9[124770]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:31.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:31 np0005539510 python3.9[124895]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397770.19629-483-20650074064733/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:32 np0005539510 python3.9[125047]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:29:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:32.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:29:33 np0005539510 python3.9[125173]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397772.0220683-529-179184675933415/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:33.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:34 np0005539510 python3.9[125325]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:34 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:34.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:35 np0005539510 python3.9[125451]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397773.7340424-574-126815727553777/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:35.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:36 np0005539510 python3.9[125603]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:36.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:37 np0005539510 python3.9[125729]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397775.5305605-619-232983562042340/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:37.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:37 np0005539510 python3.9[125881]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:38 np0005539510 python3.9[126034]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:38.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:39.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:39 np0005539510 python3.9[126189]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:39 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:40 np0005539510 python3.9[126342]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:40.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:41.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:42 np0005539510 python3.9[126495]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:29:42 np0005539510 python3.9[126770]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:42.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:43.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:43 np0005539510 python3.9[126975]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:44 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:29:44 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:44.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:45.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 01:29:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 01:29:46 np0005539510 python3.9[127126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:46.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:47.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Nov 29 01:29:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:47 np0005539510 podman[127326]: 2025-11-29 06:29:47.940451235 +0000 UTC m=+0.750553221 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Nov 29 01:29:48 np0005539510 podman[127326]: 2025-11-29 06:29:48.038125393 +0000 UTC m=+0.848227369 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 29 01:29:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:29:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 18.68 MB, 0.03 MB/s#012Interval WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Nov 29 01:29:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:48.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:49 np0005539510 podman[127532]: 2025-11-29 06:29:49.25847617 +0000 UTC m=+0.166948270 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:29:49 np0005539510 podman[127605]: 2025-11-29 06:29:49.33104155 +0000 UTC m=+0.057260588 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:29:49 np0005539510 podman[127532]: 2025-11-29 06:29:49.338567833 +0000 UTC m=+0.247039913 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:29:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:49.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:49 np0005539510 python3.9[127639]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:49 np0005539510 ovs-vsctl[127672]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 01:29:49 np0005539510 podman[127673]: 2025-11-29 06:29:49.645345527 +0000 UTC m=+0.135979583 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, description=keepalived for Ceph, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=Ceph keepalived, release=1793, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, architecture=x86_64)
Nov 29 01:29:49 np0005539510 podman[127717]: 2025-11-29 06:29:49.719956842 +0000 UTC m=+0.053296720 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, version=2.2.4, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-type=git, architecture=x86_64, description=keepalived for Ceph)
Nov 29 01:29:49 np0005539510 podman[127673]: 2025-11-29 06:29:49.756465128 +0000 UTC m=+0.247099164 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.buildah.version=1.28.2, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, version=2.2.4)
Nov 29 01:29:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:50 np0005539510 python3.9[127858]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:50.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:51.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:52.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:53 np0005539510 python3.9[128132]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:29:53 np0005539510 ovs-vsctl[128144]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 01:29:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:53.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:53 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:53 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:54 np0005539510 python3.9[128294]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:29:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:29:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:29:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:29:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:29:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:54.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:55 np0005539510 python3.9[128449]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:55.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:56 np0005539510 python3.9[128601]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:56 np0005539510 python3.9[128680]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:29:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:56.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:29:57 np0005539510 python3.9[128832]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:57.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:57 np0005539510 python3.9[128910]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:58 np0005539510 python3.9[129063]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:29:58.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:29:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:29:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:29:59.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:29:59 np0005539510 python3.9[129215]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:00 np0005539510 python3.9[129293]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:00.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:01 np0005539510 python3.9[129446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000028s ======
Nov 29 01:30:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:01.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 29 01:30:01 np0005539510 python3.9[129524]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:01 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:30:02 np0005539510 python3.9[129676]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:02 np0005539510 systemd[1]: Reloading.
Nov 29 01:30:02 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:02 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:02.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:03.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:03 np0005539510 python3.9[129916]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:04 np0005539510 python3.9[129994]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:04 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:04.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:05 np0005539510 python3.9[130147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:05.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:05 np0005539510 python3.9[130225]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:30:06 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:30:06 np0005539510 python3.9[130377]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:06 np0005539510 systemd[1]: Reloading.
Nov 29 01:30:06 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:06 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:06 np0005539510 systemd[1]: Starting Create netns directory...
Nov 29 01:30:06 np0005539510 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:30:06 np0005539510 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:30:06 np0005539510 systemd[1]: Finished Create netns directory.
Nov 29 01:30:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:06.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:07.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:07 np0005539510 python3.9[130622]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:08 np0005539510 python3.9[130775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:08.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:09 np0005539510 python3.9[130898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.1627903-1371-62291109263272/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:09.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:09 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:10 np0005539510 python3.9[131050]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:10.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:11 np0005539510 python3.9[131203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:11.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:11 np0005539510 python3.9[131326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.6854649-1446-234358258156707/.source.json _original_basename=.b_0l47u2 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:12 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:30:12 np0005539510 python3.9[131479]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:12.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:13.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:14.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:15.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:15 np0005539510 python3.9[131908]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 01:30:16 np0005539510 python3.9[132061]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:30:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:16.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:17.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:17 np0005539510 python3.9[132213]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:30:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:18.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:19.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:19 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:19 np0005539510 python3[132392]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:30:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:20.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:21.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:22.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:23.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:24.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:25.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:26.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:27.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:27 np0005539510 podman[132406]: 2025-11-29 06:30:27.560232244 +0000 UTC m=+7.634881301 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:27 np0005539510 podman[132578]: 2025-11-29 06:30:27.676805423 +0000 UTC m=+0.020905146 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:28 np0005539510 podman[132578]: 2025-11-29 06:30:28.431236407 +0000 UTC m=+0.775336110 container create 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:30:28 np0005539510 python3[132392]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:28.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:29.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:30 np0005539510 python3.9[132769]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:30.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:31.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:32.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:33.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:34 np0005539510 python3.9[132923]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:34 np0005539510 python3.9[133003]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:34.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:35 np0005539510 python3.9[133154]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397834.9574444-1710-187878217196123/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:36 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:30:36 np0005539510 python3.9[133230]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:30:36 np0005539510 systemd[1]: Reloading.
Nov 29 01:30:36 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:36 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:36.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:37.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:38 np0005539510 python3.9[133343]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:38 np0005539510 systemd[1]: Reloading.
Nov 29 01:30:38 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:38 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:38 np0005539510 systemd[1]: Starting ovn_controller container...
Nov 29 01:30:38 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:30:38 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af8eceb0d62786f3349e40b8f178df504877b374937d019d99a45125a9ac3338/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:30:38 np0005539510 systemd[1]: Started /usr/bin/podman healthcheck run 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947.
Nov 29 01:30:38 np0005539510 podman[133386]: 2025-11-29 06:30:38.887306839 +0000 UTC m=+0.132132740 container init 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:30:38 np0005539510 ovn_controller[133401]: + sudo -E kolla_set_configs
Nov 29 01:30:38 np0005539510 podman[133386]: 2025-11-29 06:30:38.920162756 +0000 UTC m=+0.164988657 container start 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:30:38 np0005539510 edpm-start-podman-container[133386]: ovn_controller
Nov 29 01:30:38 np0005539510 systemd[1]: Created slice User Slice of UID 0.
Nov 29 01:30:38 np0005539510 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 01:30:38 np0005539510 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 01:30:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:38.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:38 np0005539510 systemd[1]: Starting User Manager for UID 0...
Nov 29 01:30:38 np0005539510 edpm-start-podman-container[133385]: Creating additional drop-in dependency for "ovn_controller" (4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947)
Nov 29 01:30:38 np0005539510 podman[133408]: 2025-11-29 06:30:38.999155259 +0000 UTC m=+0.069355284 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 29 01:30:39 np0005539510 systemd[1]: 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947-2c86bc44c568e13f.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:30:39 np0005539510 systemd[1]: 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947-2c86bc44c568e13f.service: Failed with result 'exit-code'.
Nov 29 01:30:39 np0005539510 systemd[1]: Reloading.
Nov 29 01:30:39 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:39 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:39 np0005539510 systemd[133441]: Queued start job for default target Main User Target.
Nov 29 01:30:39 np0005539510 systemd[133441]: Created slice User Application Slice.
Nov 29 01:30:39 np0005539510 systemd[133441]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 01:30:39 np0005539510 systemd[133441]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:30:39 np0005539510 systemd[133441]: Reached target Paths.
Nov 29 01:30:39 np0005539510 systemd[133441]: Reached target Timers.
Nov 29 01:30:39 np0005539510 systemd[133441]: Starting D-Bus User Message Bus Socket...
Nov 29 01:30:39 np0005539510 systemd[133441]: Starting Create User's Volatile Files and Directories...
Nov 29 01:30:39 np0005539510 systemd[133441]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:30:39 np0005539510 systemd[133441]: Reached target Sockets.
Nov 29 01:30:39 np0005539510 systemd[133441]: Finished Create User's Volatile Files and Directories.
Nov 29 01:30:39 np0005539510 systemd[133441]: Reached target Basic System.
Nov 29 01:30:39 np0005539510 systemd[133441]: Reached target Main User Target.
Nov 29 01:30:39 np0005539510 systemd[133441]: Startup finished in 151ms.
Nov 29 01:30:39 np0005539510 systemd[1]: Started User Manager for UID 0.
Nov 29 01:30:39 np0005539510 systemd[1]: Started ovn_controller container.
Nov 29 01:30:39 np0005539510 systemd[1]: Started Session c1 of User root.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: INFO:__main__:Validating config file
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: INFO:__main__:Writing out command to execute
Nov 29 01:30:39 np0005539510 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: ++ cat /run_command
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + ARGS=
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + sudo kolla_copy_cacerts
Nov 29 01:30:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:39.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:39 np0005539510 systemd[1]: Started Session c2 of User root.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + [[ ! -n '' ]]
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + . kolla_extend_start
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + umask 0022
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 01:30:39 np0005539510 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5169] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5176] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5186] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5190] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5193] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:30:39 np0005539510 kernel: br-int: entered promiscuous mode
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:39 np0005539510 ovn_controller[133401]: 2025-11-29T06:30:39Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5398] manager: (ovn-2fa832-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5405] manager: (ovn-e15f55-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 01:30:39 np0005539510 systemd-udevd[133535]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:30:39 np0005539510 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5556] device (genev_sys_6081): carrier: link connected
Nov 29 01:30:39 np0005539510 NetworkManager[48989]: <info>  [1764397839.5559] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 01:30:39 np0005539510 systemd-udevd[133537]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:30:40 np0005539510 NetworkManager[48989]: <info>  [1764397840.0798] manager: (ovn-93db78-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 01:30:40 np0005539510 python3.9[133668]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:40 np0005539510 ovs-vsctl[133669]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 01:30:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:40.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:41 np0005539510 python3.9[133821]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:41 np0005539510 ovs-vsctl[133823]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 01:30:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:41.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:42 np0005539510 python3.9[133977]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:42 np0005539510 ovs-vsctl[133978]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 01:30:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:42.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:43 np0005539510 systemd-logind[784]: Session 45 logged out. Waiting for processes to exit.
Nov 29 01:30:43 np0005539510 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 01:30:43 np0005539510 systemd[1]: session-45.scope: Consumed 57.493s CPU time.
Nov 29 01:30:43 np0005539510 systemd-logind[784]: Removed session 45.
Nov 29 01:30:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:43.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:45.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:46.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:47.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:48 np0005539510 systemd-logind[784]: New session 47 of user zuul.
Nov 29 01:30:48 np0005539510 systemd[1]: Started Session 47 of User zuul.
Nov 29 01:30:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:48.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:49.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:49 np0005539510 systemd[1]: Stopping User Manager for UID 0...
Nov 29 01:30:49 np0005539510 systemd[133441]: Activating special unit Exit the Session...
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped target Main User Target.
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped target Basic System.
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped target Paths.
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped target Sockets.
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped target Timers.
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:30:49 np0005539510 systemd[133441]: Closed D-Bus User Message Bus Socket.
Nov 29 01:30:49 np0005539510 systemd[133441]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:30:49 np0005539510 systemd[133441]: Removed slice User Application Slice.
Nov 29 01:30:49 np0005539510 systemd[133441]: Reached target Shutdown.
Nov 29 01:30:49 np0005539510 systemd[133441]: Finished Exit the Session.
Nov 29 01:30:49 np0005539510 systemd[133441]: Reached target Exit the Session.
Nov 29 01:30:49 np0005539510 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 01:30:49 np0005539510 systemd[1]: Stopped User Manager for UID 0.
Nov 29 01:30:49 np0005539510 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 01:30:49 np0005539510 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 01:30:49 np0005539510 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 01:30:49 np0005539510 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 01:30:49 np0005539510 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 01:30:49 np0005539510 python3.9[134211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:30:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:50.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:52 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:30:52 np0005539510 python3.9[134376]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:52.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:53.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:53 np0005539510 python3.9[134532]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:54 np0005539510 python3.9[134684]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:54 np0005539510 python3.9[134839]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:54.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:55.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:55 np0005539510 python3.9[134991]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:30:56 np0005539510 python3.9[135143]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:30:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:56.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:30:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:57.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:30:57 np0005539510 python3.9[135296]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:30:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:30:58.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:59 np0005539510 python3.9[135452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:30:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:30:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:30:59.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:30:59 np0005539510 python3.9[135573]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397858.4530733-225-200257258328877/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:00 np0005539510 python3.9[135725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:00 np0005539510 python3.9[135847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397859.9660137-270-18791181791034/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:01.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:01.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:02 np0005539510 python3.9[135999]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:31:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:03.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:03 np0005539510 python3.9[136086]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:31:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:03.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:05.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:05.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:05 np0005539510 python3.9[136294]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:31:06 np0005539510 python3.9[136450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:07.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:07 np0005539510 python3.9[136571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397866.1569865-381-61470608382686/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:07.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:07 np0005539510 python3.9[136875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:07 np0005539510 podman[136896]: 2025-11-29 06:31:07.851468329 +0000 UTC m=+0.092945522 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:31:07 np0005539510 podman[136896]: 2025-11-29 06:31:07.98111456 +0000 UTC m=+0.222591693 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 29 01:31:08 np0005539510 python3.9[137089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397867.2855337-381-110600878901683/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:08 np0005539510 podman[137199]: 2025-11-29 06:31:08.570419304 +0000 UTC m=+0.068645665 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:31:08 np0005539510 podman[137199]: 2025-11-29 06:31:08.608235196 +0000 UTC m=+0.106461527 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:31:08 np0005539510 podman[137266]: 2025-11-29 06:31:08.878303069 +0000 UTC m=+0.077152474 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, build-date=2023-02-22T09:23:20, vcs-type=git, description=keepalived for Ceph, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, architecture=x86_64, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:31:08 np0005539510 podman[137266]: 2025-11-29 06:31:08.899199724 +0000 UTC m=+0.098049089 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, build-date=2023-02-22T09:23:20, release=1793, vendor=Red Hat, Inc., version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64)
Nov 29 01:31:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:31:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:09.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:31:09 np0005539510 ovn_controller[133401]: 2025-11-29T06:31:09Z|00025|memory|INFO|16128 kB peak resident set size after 29.9 seconds
Nov 29 01:31:09 np0005539510 ovn_controller[133401]: 2025-11-29T06:31:09Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 29 01:31:09 np0005539510 podman[137324]: 2025-11-29 06:31:09.384429808 +0000 UTC m=+0.115711526 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:31:09 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:09 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:09.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:10 np0005539510 python3.9[137570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:10 np0005539510 python3.9[137704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397869.5004444-513-211110170257389/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:31:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:31:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:31:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:11 np0005539510 python3.9[137856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:11.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:11 np0005539510 python3.9[137977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397870.8714752-513-155870918028675/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:12 np0005539510 python3.9[138130]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:31:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:13.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:13 np0005539510 python3.9[138284]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:14 np0005539510 python3.9[138438]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:14 np0005539510 python3.9[138517]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:15.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:15 np0005539510 python3.9[138669]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:16 np0005539510 python3.9[138749]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:17 np0005539510 python3.9[138902]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:17.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:17 np0005539510 python3.9[139054]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:18 np0005539510 python3.9[139134]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:19.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:19.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:19 np0005539510 python3.9[139289]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:19 np0005539510 python3.9[139367]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:20 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:31:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:21 np0005539510 python3.9[139522]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:21 np0005539510 systemd[1]: Reloading.
Nov 29 01:31:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:31:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:31:21 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:21 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:21.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:22 np0005539510 python3.9[139714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:22 np0005539510 python3.9[139793]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:31:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:23.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:31:23 np0005539510 python3.9[139947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:24 np0005539510 python3.9[140075]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:31:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:31:25 np0005539510 python3.9[140230]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:25 np0005539510 systemd[1]: Reloading.
Nov 29 01:31:25 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:25 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:25.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:25 np0005539510 systemd[1]: Starting Create netns directory...
Nov 29 01:31:25 np0005539510 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:31:25 np0005539510 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:31:25 np0005539510 systemd[1]: Finished Create netns directory.
Nov 29 01:31:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:27.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:27 np0005539510 python3.9[140428]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:27.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:27 np0005539510 python3.9[140582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:28 np0005539510 python3.9[140705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397887.4240518-967-277431065694983/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:29.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:31:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:29.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:31:29 np0005539510 python3.9[140860]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:30 np0005539510 python3.9[141014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:31.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:31 np0005539510 python3.9[141140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397890.0096362-1041-170456456243359/.source.json _original_basename=.3p2y0425 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.203539) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891203582, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1375, "num_deletes": 252, "total_data_size": 3197559, "memory_usage": 3240736, "flush_reason": "Manual Compaction"}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891215348, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 2087601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9261, "largest_seqno": 10631, "table_properties": {"data_size": 2081808, "index_size": 3124, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12444, "raw_average_key_size": 19, "raw_value_size": 2069865, "raw_average_value_size": 3254, "num_data_blocks": 144, "num_entries": 636, "num_filter_entries": 636, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397736, "oldest_key_time": 1764397736, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 11845 microseconds, and 4645 cpu microseconds.
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.215386) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 2087601 bytes OK
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.215404) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217213) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217230) EVENT_LOG_v1 {"time_micros": 1764397891217225, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217252) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 3191162, prev total WAL file size 3191162, number of live WAL files 2.
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217928) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(2038KB)], [18(9227KB)]
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891217958, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 11536634, "oldest_snapshot_seqno": -1}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4001 keys, 9547217 bytes, temperature: kUnknown
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891271995, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 9547217, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9515741, "index_size": 20358, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 98601, "raw_average_key_size": 24, "raw_value_size": 9438548, "raw_average_value_size": 2359, "num_data_blocks": 889, "num_entries": 4001, "num_filter_entries": 4001, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764397891, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.272293) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 9547217 bytes
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.273361) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.2 rd, 176.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 9.0 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(10.1) write-amplify(4.6) OK, records in: 4521, records dropped: 520 output_compression: NoCompression
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.273382) EVENT_LOG_v1 {"time_micros": 1764397891273371, "job": 8, "event": "compaction_finished", "compaction_time_micros": 54121, "compaction_time_cpu_micros": 20949, "output_level": 6, "num_output_files": 1, "total_output_size": 9547217, "num_input_records": 4521, "num_output_records": 4001, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891273856, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764397891275503, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.217885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:31:31.275659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:31:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:31.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:32 np0005539510 python3.9[141292]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:33.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:33.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:35.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:35 np0005539510 python3.9[141727]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 01:31:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:35.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:36 np0005539510 python3.9[141879]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:31:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:37.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:37.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:38 np0005539510 python3.9[142032]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:31:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:39.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:39.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:39 np0005539510 podman[142184]: 2025-11-29 06:31:39.94158813 +0000 UTC m=+0.103599538 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:31:40 np0005539510 python3[142228]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:31:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:31:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:41.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:31:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:31:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:41.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:31:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:31:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:43.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:31:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:45.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:45.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:47.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:49.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:49.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:51.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:31:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:51 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:31:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:51.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:51 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:31:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:53.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:53 np0005539510 podman[142252]: 2025-11-29 06:31:53.466558882 +0000 UTC m=+13.216320837 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:31:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:53.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:53 np0005539510 podman[142479]: 2025-11-29 06:31:53.630403229 +0000 UTC m=+0.052197480 container create b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:31:53 np0005539510 podman[142479]: 2025-11-29 06:31:53.599014432 +0000 UTC m=+0.020808713 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:31:53 np0005539510 python3[142228]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:31:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:55.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:55.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:31:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:57.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:57.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:31:59.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:31:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:31:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:31:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:31:59.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:01.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:01.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:32:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:03.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:32:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:03.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:32:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:05.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:32:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:32:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:05.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:32:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:07.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:32:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:07.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:32:08 np0005539510 python3.9[142727]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:09.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:32:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:09.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:32:09 np0005539510 python3.9[142881]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:10 np0005539510 podman[142957]: 2025-11-29 06:32:10.115063383 +0000 UTC m=+0.124357398 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:32:10 np0005539510 python3.9[142958]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:10 np0005539510 python3.9[143135]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397930.265606-1305-245620988595391/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:11.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:11 np0005539510 python3.9[143211]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:11 np0005539510 systemd[1]: Reloading.
Nov 29 01:32:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:11.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:11 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:11 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:12 np0005539510 python3.9[143323]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:12 np0005539510 systemd[1]: Reloading.
Nov 29 01:32:12 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:12 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:12 np0005539510 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 01:32:12 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:32:12 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e600fbe3e5682b63fdf6b9076df48890b4fc9515c63768d1ddebb9c35046ec3/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:12 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e600fbe3e5682b63fdf6b9076df48890b4fc9515c63768d1ddebb9c35046ec3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:12 np0005539510 systemd[1]: Started /usr/bin/podman healthcheck run b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108.
Nov 29 01:32:12 np0005539510 podman[143365]: 2025-11-29 06:32:12.911425851 +0000 UTC m=+0.131028946 container init b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 29 01:32:12 np0005539510 ovn_metadata_agent[143380]: + sudo -E kolla_set_configs
Nov 29 01:32:12 np0005539510 podman[143365]: 2025-11-29 06:32:12.957835664 +0000 UTC m=+0.177438759 container start b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:32:12 np0005539510 edpm-start-podman-container[143365]: ovn_metadata_agent
Nov 29 01:32:13 np0005539510 edpm-start-podman-container[143364]: Creating additional drop-in dependency for "ovn_metadata_agent" (b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108)
Nov 29 01:32:13 np0005539510 podman[143386]: 2025-11-29 06:32:13.033413998 +0000 UTC m=+0.061694537 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:32:13 np0005539510 systemd[1]: Reloading.
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Validating config file
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Copying service configuration files
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Writing out command to execute
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: ++ cat /run_command
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + CMD=neutron-ovn-metadata-agent
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + ARGS=
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + sudo kolla_copy_cacerts
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + [[ ! -n '' ]]
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + . kolla_extend_start
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + umask 0022
Nov 29 01:32:13 np0005539510 ovn_metadata_agent[143380]: + exec neutron-ovn-metadata-agent
Nov 29 01:32:13 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:13 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:32:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:13.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:32:13 np0005539510 systemd[1]: Started ovn_metadata_agent container.
Nov 29 01:32:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:32:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:13.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.085 143385 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.086 143385 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.086 143385 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.087 143385 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.088 143385 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.089 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.090 143385 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.091 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.092 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.093 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.094 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.095 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.096 143385 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.097 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.098 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.099 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.100 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.101 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.102 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.103 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.104 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.105 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.106 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.107 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.108 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.109 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.110 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.111 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.112 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.113 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.114 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.115 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.116 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.117 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.118 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.119 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.120 143385 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.129 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.143 143385 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name fa6f2e5a-176a-4b37-8b2a-5aaf74119c47 (UUID: fa6f2e5a-176a-4b37-8b2a-5aaf74119c47) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 29 01:32:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:15.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.170 143385 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.174 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.202 143385 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.300 143385 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'fa6f2e5a-176a-4b37-8b2a-5aaf74119c47'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7a2ef86760>], external_ids={}, name=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, nb_cfg_timestamp=1764397847534, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.302 143385 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f7a2ef75f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.303 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.303 143385 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.304 143385 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.304 143385 INFO oslo_service.service [-] Starting 1 workers
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.311 143385 DEBUG oslo_service.service [-] Started child 143492 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.315 143492 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2000351'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.318 143385 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpfwloyrd0/privsep.sock']
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.361 143492 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.361 143492 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.361 143492 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.366 143492 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.371 143492 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:32:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.377 143492 INFO eventlet.wsgi.server [-] (143492) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 29 01:32:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:15.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:15 np0005539510 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.057 143385 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.058 143385 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpfwloyrd0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.892 143497 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.898 143497 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.906 143497 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:15.907 143497 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143497
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.061 143497 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6302d5-250e-431f-83c0-e4e736a6dceb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:32:16 np0005539510 systemd[1]: session-47.scope: Deactivated successfully.
Nov 29 01:32:16 np0005539510 systemd[1]: session-47.scope: Consumed 58.749s CPU time.
Nov 29 01:32:16 np0005539510 systemd-logind[784]: Session 47 logged out. Waiting for processes to exit.
Nov 29 01:32:16 np0005539510 systemd-logind[784]: Removed session 47.
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.626 143497 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.626 143497 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:32:16 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:16.626 143497 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:32:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:17.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.190 143497 DEBUG oslo.privsep.daemon [-] privsep: reply[c0429e42-80b0-4f46-a43b-31b344e33fec]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.194 143385 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, column=external_ids, values=({'neutron:ovn-metadata-id': '0fe2b91c-e971-5566-838d-5b85755822a8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.203 143385 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.209 143385 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.210 143385 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.211 143385 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.212 143385 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.213 143385 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.214 143385 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.215 143385 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.216 143385 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.217 143385 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.218 143385 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.219 143385 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.220 143385 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.221 143385 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.222 143385 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.223 143385 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.224 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.225 143385 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.226 143385 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.227 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.228 143385 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.229 143385 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.230 143385 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.231 143385 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.232 143385 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.233 143385 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.234 143385 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.235 143385 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.236 143385 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.237 143385 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.238 143385 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.239 143385 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.240 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.241 143385 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.242 143385 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.243 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.244 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.245 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.246 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.247 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:17 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:32:17.248 143385 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:32:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:17.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:19.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:19.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.004000100s ======
Nov 29 01:32:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:21.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000100s
Nov 29 01:32:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:21.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:22 np0005539510 systemd-logind[784]: New session 48 of user zuul.
Nov 29 01:32:22 np0005539510 systemd[1]: Started Session 48 of User zuul.
Nov 29 01:32:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:23.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:23 np0005539510 python3.9[143681]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:32:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:24 np0005539510 python3.9[143838]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:32:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:25.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:25.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:27.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:27.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:28 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:32:29 np0005539510 python3.9[144054]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:29 np0005539510 systemd[1]: Reloading.
Nov 29 01:32:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:29.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:29 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:29 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:29.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:29 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 01:32:29 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(23) init, last seen epoch 23, mid-election, bumping
Nov 29 01:32:29 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:31.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:31.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:32 np0005539510 python3.9[144241]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:32:32 np0005539510 network[144258]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:32:32 np0005539510 network[144259]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:32:32 np0005539510 network[144260]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:32:32 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:32:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:33.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:34 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Nov 29 01:32:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:35 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : overall HEALTH_OK
Nov 29 01:32:35 np0005539510 ceph-mon[77142]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 29 01:32:35 np0005539510 ceph-mon[77142]: paxos.1).electionLogic(26) init, last seen epoch 26
Nov 29 01:32:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 handle_timecheck drop unexpected msg
Nov 29 01:32:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:35.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:36 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:32:37 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 29 01:32:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:37.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:37 np0005539510 python3.9[144525]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:37.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:37 np0005539510 python3.9[144678]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-2 is new leader, mons compute-2,compute-1 in quorum (ranks 1,2)
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-2 calling monitor election
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-1 calling monitor election
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-0 calling monitor election
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 29 01:32:38 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:32:38 np0005539510 python3.9[144832]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:32:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:39.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:32:39 np0005539510 python3.9[144985]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:39.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:40 np0005539510 python3.9[145138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:40 np0005539510 podman[145140]: 2025-11-29 06:32:40.476794525 +0000 UTC m=+0.109759728 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:32:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:41 np0005539510 python3.9[145318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:41.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:41.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:41 np0005539510 python3.9[145471]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:43.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:43 np0005539510 podman[145498]: 2025-11-29 06:32:43.921915938 +0000 UTC m=+0.069617330 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 01:32:44 np0005539510 python3.9[145644]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:45 np0005539510 python3.9[145797]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:45.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:45.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:45 np0005539510 python3.9[145949]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:46 np0005539510 python3.9[146175]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:47.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:47 np0005539510 python3.9[146435]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:47.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:47 np0005539510 python3.9[146587]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:48 np0005539510 python3.9[146739]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:49.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:49.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:51.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:51.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:52 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:32:52 np0005539510 python3.9[146893]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:53 np0005539510 python3.9[147046]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:53.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:53.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:53 np0005539510 python3.9[147198]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:54 np0005539510 python3.9[147350]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:55 np0005539510 python3.9[147503]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:55.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:55.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:55 np0005539510 python3.9[147655]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:32:56 np0005539510 python3.9[147807]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:57.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:57.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:32:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:32:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:32:58 np0005539510 python3.9[147961]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:32:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:32:59.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:32:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:32:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:32:59.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:32:59 np0005539510 python3.9[148113]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:33:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:01.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:01 np0005539510 python3.9[148268]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:33:01 np0005539510 systemd[1]: Reloading.
Nov 29 01:33:01 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:33:01 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:33:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:01.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:02 np0005539510 python3.9[148456]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:03.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:03 np0005539510 python3.9[148609]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:03.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:04 np0005539510 python3.9[148762]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:04 np0005539510 python3.9[148916]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:05.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:05 np0005539510 python3.9[149069]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:05.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:06 np0005539510 python3.9[149222]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:06 np0005539510 python3.9[149426]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:07.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:07.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:09.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:09 np0005539510 python3.9[149580]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 01:33:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:09.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:10 np0005539510 python3.9[149733]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:33:10 np0005539510 podman[149765]: 2025-11-29 06:33:10.941697039 +0000 UTC m=+0.095433833 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 01:33:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:11.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:33:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:33:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:11.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:11 np0005539510 python3.9[149969]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:33:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:13.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:13.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:14 np0005539510 podman[150004]: 2025-11-29 06:33:14.879670973 +0000 UTC m=+0.046759036 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:33:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:33:15.122 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:33:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:33:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:33:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:33:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:33:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:15.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:15.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:16 np0005539510 python3.9[150151]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:33:17 np0005539510 python3.9[150236]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:33:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:17.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:17.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:19.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:19.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:21.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:21.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:23.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:23.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:25.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:25.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:27.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:27.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:29.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:29.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:31.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:31.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:33.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:33.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:35.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:35.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:37.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:37.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:39.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:39.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:41.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:41.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:41 np0005539510 podman[150496]: 2025-11-29 06:33:41.962700671 +0000 UTC m=+0.115386101 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 01:33:42 np0005539510 kernel: SELinux:  Converting 2771 SID table entries...
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:33:42 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:33:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:43.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:45.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:45.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:45 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 29 01:33:45 np0005539510 podman[150529]: 2025-11-29 06:33:45.902052649 +0000 UTC m=+0.060203448 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:33:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:47.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:47.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:49.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:49.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:51.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:51 np0005539510 kernel: SELinux:  Converting 2771 SID table entries...
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:33:51 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:33:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:51.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:53.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:33:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:53.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:33:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:55.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:55.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:33:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:57.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:33:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:57.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:33:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:33:59.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:33:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:33:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:33:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:33:59.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:01.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:01.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:03.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:03.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:05.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:05.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:06 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 01:34:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:07.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:07.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:09.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:09.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:11.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:12 np0005539510 podman[156984]: 2025-11-29 06:34:12.982834668 +0000 UTC m=+0.111412824 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:34:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:34:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:34:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:34:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:13.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:13.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:34:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:34:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:34:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:34:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:34:15.124 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:34:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:15.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:15.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:16 np0005539510 podman[159745]: 2025-11-29 06:34:16.895845641 +0000 UTC m=+0.054923026 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 01:34:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:17.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:17.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:19.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:19.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:21.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:21.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:23.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:23.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:25.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:27.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:27.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:29.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:29.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:31.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:31.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:33.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:33.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:35.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:35.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:37.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:34:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:37.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:34:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:39.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:41.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:41.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:43 np0005539510 podman[167736]: 2025-11-29 06:34:43.153683929 +0000 UTC m=+0.095284719 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 01:34:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:43.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:43.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:44 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:34:44 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:34:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:45.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:45.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:47 np0005539510 podman[167818]: 2025-11-29 06:34:47.27986255 +0000 UTC m=+0.047027803 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:34:47 np0005539510 kernel: SELinux:  Converting 2772 SID table entries...
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:34:47 np0005539510 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:34:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:47.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:47.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:48 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:34:48 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 01:34:48 np0005539510 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:34:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:49.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:49.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:51.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:51.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:52 np0005539510 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:34:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:53.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:53.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:55.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:55.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:34:56 np0005539510 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 01:34:56 np0005539510 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 01:34:56 np0005539510 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 01:34:56 np0005539510 systemd[1]: sshd.service: Consumed 4.156s CPU time, read 32.0K from disk, written 56.0K to disk.
Nov 29 01:34:56 np0005539510 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 01:34:56 np0005539510 systemd[1]: Stopping sshd-keygen.target...
Nov 29 01:34:56 np0005539510 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:34:56 np0005539510 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:34:56 np0005539510 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:34:56 np0005539510 systemd[1]: Reached target sshd-keygen.target.
Nov 29 01:34:56 np0005539510 systemd[1]: Starting OpenSSH server daemon...
Nov 29 01:34:56 np0005539510 systemd[1]: Started OpenSSH server daemon.
Nov 29 01:34:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:57.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:57.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:58 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:34:59 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:34:59 np0005539510 systemd[1]: Reloading.
Nov 29 01:34:59 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:34:59 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:34:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:34:59.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:34:59 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:34:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:34:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:34:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:34:59.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:01.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:01.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:03.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:03.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:05.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:07.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:07.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:09.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:09.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:09 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:35:09 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:35:09 np0005539510 systemd[1]: man-db-cache-update.service: Consumed 10.421s CPU time.
Nov 29 01:35:09 np0005539510 systemd[1]: run-rb3ab222678514015b742f487412dcd14.service: Deactivated successfully.
Nov 29 01:35:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:11.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:11.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:35:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:13.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:35:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:13.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:13 np0005539510 podman[177461]: 2025-11-29 06:35:13.948211909 +0000 UTC m=+0.110485387 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 01:35:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:35:15.125 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:35:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:35:15.126 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:35:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:35:15.126 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:35:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:15.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:15.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:17.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:17.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:17 np0005539510 podman[177490]: 2025-11-29 06:35:17.875551673 +0000 UTC m=+0.042081150 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 01:35:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:19.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:19.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:21.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:21.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:23.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:23 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:35:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:23.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:35:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:25.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:35:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:25.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:27.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:27.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:29.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:29.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:31.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:31.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:33.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:33.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:35.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:35.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:37.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:37.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:39.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:35:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:39.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:35:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:41.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:41.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:43.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:43.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:44 np0005539510 podman[177704]: 2025-11-29 06:35:44.968534847 +0000 UTC m=+0.122484498 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:35:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:45.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:45.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:47.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:47.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:48 np0005539510 podman[177783]: 2025-11-29 06:35:48.879629578 +0000 UTC m=+0.047428443 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 01:35:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:49.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:49.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:51.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:51.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:35:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:35:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:53 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:35:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:53.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:53.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:55.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:55.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:57 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:35:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:57.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:57.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:35:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:35:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:35:59.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:35:59 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:35:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:35:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:35:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:35:59.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:01.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:36:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:36:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:36:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:36:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:03.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:05.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:05.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:07 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:36:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:07.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:36:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:07.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:09.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:09.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:11.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:11.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:12 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:13.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:13.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:36:15.126 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:36:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:36:15.127 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:36:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:36:15.127 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:36:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:15.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:15.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:15 np0005539510 podman[177866]: 2025-11-29 06:36:15.965084758 +0000 UTC m=+0.113858617 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:36:17 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:17.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:17.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:19.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:19.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:19 np0005539510 podman[177892]: 2025-11-29 06:36:19.922264559 +0000 UTC m=+0.079934259 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 01:36:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:21.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:36:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:21.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:36:22 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:23 np0005539510 systemd[1]: session-48.scope: Deactivated successfully.
Nov 29 01:36:23 np0005539510 systemd[1]: session-48.scope: Consumed 1min 54.847s CPU time.
Nov 29 01:36:23 np0005539510 systemd-logind[784]: Session 48 logged out. Waiting for processes to exit.
Nov 29 01:36:23 np0005539510 systemd-logind[784]: Removed session 48.
Nov 29 01:36:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:23.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:23.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000024s ======
Nov 29 01:36:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:25.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Nov 29 01:36:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000023s ======
Nov 29 01:36:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:25.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000023s
Nov 29 01:36:27 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000046s ======
Nov 29 01:36:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:27.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000046s
Nov 29 01:36:27 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:36:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:27.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:29.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:36:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:29.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:36:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:31.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:31.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:32 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:33.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:36:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:33.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:36:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:35.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:35.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:37.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:37.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:37 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:39.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:39.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:40 np0005539510 systemd-logind[784]: New session 49 of user zuul.
Nov 29 01:36:40 np0005539510 systemd[1]: Started Session 49 of User zuul.
Nov 29 01:36:40 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:36:41 np0005539510 python3.9[178105]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:41 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:41 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:41 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:36:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:41.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:36:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:41.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:41 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:36:42 np0005539510 python3.9[178345]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:42 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:42 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:42 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:42 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:43 np0005539510 python3.9[178537]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:43.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:43.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:44 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:44 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:44 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:45.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:45 np0005539510 python3.9[178728]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:45 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:45 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:45 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:45.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:46 np0005539510 podman[178793]: 2025-11-29 06:36:46.942973959 +0000 UTC m=+0.094992141 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:36:47 np0005539510 python3.9[178947]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:47.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:47 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:47 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:47 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:47.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:48 np0005539510 python3.9[179186]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:48 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:48 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:48 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:49.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:49 np0005539510 python3.9[179377]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:49 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:49.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:49 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:49 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:50 np0005539510 podman[179416]: 2025-11-29 06:36:50.163892602 +0000 UTC m=+0.053973033 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 01:36:50 np0005539510 python3.9[179587]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:51.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:51 np0005539510 python3.9[179742]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:51 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:51 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:51 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:51.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:36:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:53.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:36:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:53.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:53 np0005539510 python3.9[179933]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:53 np0005539510 systemd[1]: Reloading.
Nov 29 01:36:53 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:53 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:55.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:55.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:56 np0005539510 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 01:36:56 np0005539510 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 01:36:56 np0005539510 python3.9[180129]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:57 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 29 01:36:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:57.476897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:36:57 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 29 01:36:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398217476961, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2443, "num_deletes": 251, "total_data_size": 6349708, "memory_usage": 6431728, "flush_reason": "Manual Compaction"}
Nov 29 01:36:57 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 29 01:36:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:36:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:36:57 np0005539510 python3.9[180284]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:57.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218109138, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 4155680, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10636, "largest_seqno": 13074, "table_properties": {"data_size": 4145762, "index_size": 6348, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19650, "raw_average_key_size": 20, "raw_value_size": 4125989, "raw_average_value_size": 4205, "num_data_blocks": 284, "num_entries": 981, "num_filter_entries": 981, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397891, "oldest_key_time": 1764397891, "file_creation_time": 1764398217, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 632333 microseconds, and 9962 cpu microseconds.
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.109223) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 4155680 bytes OK
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.109258) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.113621) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.113673) EVENT_LOG_v1 {"time_micros": 1764398218113660, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.113705) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6339244, prev total WAL file size 6339244, number of live WAL files 2.
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.115852) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(4058KB)], [21(9323KB)]
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218115921, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 13702897, "oldest_snapshot_seqno": -1}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4464 keys, 10609735 bytes, temperature: kUnknown
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218425311, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 10609735, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10574934, "index_size": 22531, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11205, "raw_key_size": 109201, "raw_average_key_size": 24, "raw_value_size": 10489328, "raw_average_value_size": 2349, "num_data_blocks": 972, "num_entries": 4464, "num_filter_entries": 4464, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398218, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.425516) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 10609735 bytes
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.426941) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 44.3 rd, 34.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 4982, records dropped: 518 output_compression: NoCompression
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.426988) EVENT_LOG_v1 {"time_micros": 1764398218426968, "job": 10, "event": "compaction_finished", "compaction_time_micros": 309449, "compaction_time_cpu_micros": 25577, "output_level": 6, "num_output_files": 1, "total_output_size": 10609735, "num_input_records": 4982, "num_output_records": 4464, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218428125, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398218430488, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.115702) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430570) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:58 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:36:58.430581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:36:58 np0005539510 python3.9[180440]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:59 np0005539510 python3.9[180595]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:36:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:36:59.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:36:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:36:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:36:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:36:59.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:00 np0005539510 python3.9[180750]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:01 np0005539510 python3.9[180906]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:01.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:01.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:01 np0005539510 python3.9[181061]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:02 np0005539510 python3.9[181217]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:03 np0005539510 python3.9[181372]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:03.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:03.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:04 np0005539510 python3.9[181527]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:05 np0005539510 python3.9[181683]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:05.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:37:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:05.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:37:06 np0005539510 python3.9[181838]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:06 np0005539510 python3.9[181994]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:07.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:07 np0005539510 python3.9[182149]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:37:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:07.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:09.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:09.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:11 np0005539510 python3.9[182356]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:11.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:11 np0005539510 python3.9[182508]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:11.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:12 np0005539510 python3.9[182660]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:12 np0005539510 python3.9[182813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:13 np0005539510 python3.9[182965]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:13.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:13.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:14 np0005539510 python3.9[183117]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:37:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:37:15.127 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:37:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:37:15.128 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:37:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:37:15.128 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:37:15 np0005539510 python3.9[183270]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:15.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:16 np0005539510 python3.9[183395]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398234.9310288-1633-68474357176721/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:16 np0005539510 python3.9[183548]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:17 np0005539510 podman[183645]: 2025-11-29 06:37:17.321016546 +0000 UTC m=+0.103284012 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 01:37:17 np0005539510 python3.9[183688]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398236.3487127-1633-251623601574031/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:17.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:17.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:18 np0005539510 python3.9[183850]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:18 np0005539510 python3.9[183975]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.5805945-1633-112677451003039/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:19 np0005539510 python3.9[184128]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:19.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:19 np0005539510 python3.9[184253]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.671048-1633-267694321109828/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:19.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:20 np0005539510 podman[184405]: 2025-11-29 06:37:20.243783308 +0000 UTC m=+0.049941286 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:37:20 np0005539510 python3.9[184406]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:21 np0005539510 python3.9[184550]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398239.8748834-1633-171931559811636/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:21.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:21 np0005539510 python3.9[184702]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:21.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:22 np0005539510 python3.9[184829]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.2906885-1633-214051068524991/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:22 np0005539510 python3.9[184982]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:23 np0005539510 python3.9[185105]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.4341588-1633-267192192703320/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:23.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:37:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:23.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:37:23 np0005539510 python3.9[185257]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:24 np0005539510 python3.9[185382]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398243.5279498-1633-28039942833854/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:25.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:25.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:27 np0005539510 python3.9[185536]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 01:37:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:27.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:27.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:29.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:29.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:31 np0005539510 python3.9[185741]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:31.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:31 np0005539510 python3.9[185893]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:31.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:32 np0005539510 python3.9[186045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:32 np0005539510 python3.9[186198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:33.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:33 np0005539510 python3.9[186350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:34 np0005539510 python3.9[186502]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:34 np0005539510 python3.9[186655]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:35.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:35 np0005539510 python3.9[186807]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:35.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:36 np0005539510 python3.9[186959]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:37 np0005539510 python3.9[187112]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:37.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:38 np0005539510 python3.9[187264]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539510 python3.9[187417]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:39 np0005539510 python3.9[187569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:37:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:39.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:37:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:39.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:39 np0005539510 python3.9[187721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:41.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:41.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:42 np0005539510 python3.9[188004]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:42 np0005539510 python3.9[188128]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398261.6236115-2296-210325227478095/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:43 np0005539510 python3.9[188280]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:43.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:43 np0005539510 python3.9[188403]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398262.8625185-2296-229010145427433/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:43.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:44 np0005539510 python3.9[188555]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:45 np0005539510 python3.9[188679]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398264.0138283-2296-222085802431806/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:45.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:45 np0005539510 python3.9[188831]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:45.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:46 np0005539510 python3.9[188954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398265.2459807-2296-234555465417789/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:47 np0005539510 python3.9[189107]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 01:37:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:47.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 01:37:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:47.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:47 np0005539510 podman[189178]: 2025-11-29 06:37:47.973114535 +0000 UTC m=+0.126441976 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:37:48 np0005539510 python3.9[189256]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398266.4155471-2296-61651620805441/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:48 np0005539510 python3.9[189409]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:37:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:37:48 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:37:49 np0005539510 python3.9[189582]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398268.321921-2296-78086711225755/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:49.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:49.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:50 np0005539510 python3.9[189734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:50 np0005539510 podman[189829]: 2025-11-29 06:37:50.47070735 +0000 UTC m=+0.064079326 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:37:50 np0005539510 python3.9[189875]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398269.6306927-2296-66826701042962/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:51 np0005539510 python3.9[190029]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:51.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:51 np0005539510 python3.9[190152]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398270.8204887-2296-211885001576689/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:51.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:52 np0005539510 python3.9[190304]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:52 np0005539510 python3.9[190428]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398271.9884522-2296-154282123026614/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:37:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:53.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:37:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 01:37:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:53.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 01:37:53 np0005539510 python3.9[190580]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:54 np0005539510 python3.9[190703]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398273.0850105-2296-78084349766886/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:54 np0005539510 auditd[699]: Audit daemon rotating log files
Nov 29 01:37:55 np0005539510 python3.9[190856]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:37:55 np0005539510 python3.9[190979]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398274.5954678-2296-74899851920383/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:37:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:37:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:55.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:56 np0005539510 python3.9[191132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:57 np0005539510 python3.9[191255]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398275.825394-2296-194733351450830/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:57.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:57.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:58 np0005539510 python3.9[191407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:58 np0005539510 python3.9[191531]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398277.8263867-2296-192359018781522/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:59 np0005539510 python3.9[191683]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:37:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:37:59.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:37:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:37:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:37:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:37:59.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:00 np0005539510 python3.9[191806]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398279.0187278-2296-239034984973653/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:01.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:01.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:02 np0005539510 python3.9[191957]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:03.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:03.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:04 np0005539510 python3.9[192113]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 01:38:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:05.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:05.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:07 np0005539510 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 01:38:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:07.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:07.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:08 np0005539510 python3.9[192322]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:09 np0005539510 python3.9[192524]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:09.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:09.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:10 np0005539510 python3.9[192676]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:10 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:38:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:10 np0005539510 python3.9[192830]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:38:11 np0005539510 python3.9[192982]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:11.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:11.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:12 np0005539510 python3.9[193134]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:13 np0005539510 python3.9[193287]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:13.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:13 np0005539510 python3.9[193439]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:13.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:14 np0005539510 python3.9[193591]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:38:15.129 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:38:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:38:15.130 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:38:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:38:15.130 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:38:15 np0005539510 python3.9[193744]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:15.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:15.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:16 np0005539510 python3.9[193896]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:16 np0005539510 systemd[1]: Reloading.
Nov 29 01:38:16 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:16 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:16 np0005539510 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 01:38:16 np0005539510 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 01:38:16 np0005539510 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 01:38:16 np0005539510 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 01:38:16 np0005539510 systemd[1]: Starting libvirt logging daemon...
Nov 29 01:38:16 np0005539510 systemd[1]: Started libvirt logging daemon.
Nov 29 01:38:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:17.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:17 np0005539510 python3.9[194090]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:17 np0005539510 systemd[1]: Reloading.
Nov 29 01:38:17 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:17 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:17.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:18 np0005539510 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 01:38:18 np0005539510 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 01:38:18 np0005539510 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 01:38:18 np0005539510 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 01:38:18 np0005539510 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 01:38:18 np0005539510 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 01:38:18 np0005539510 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 01:38:18 np0005539510 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:38:18 np0005539510 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:38:18 np0005539510 podman[194127]: 2025-11-29 06:38:18.213601642 +0000 UTC m=+0.159232301 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:38:18 np0005539510 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 01:38:18 np0005539510 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 01:38:18 np0005539510 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 01:38:18 np0005539510 python3.9[194339]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:18 np0005539510 systemd[1]: Reloading.
Nov 29 01:38:19 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:19 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:19 np0005539510 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 01:38:19 np0005539510 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 01:38:19 np0005539510 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 01:38:19 np0005539510 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 01:38:19 np0005539510 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:38:19 np0005539510 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:38:19 np0005539510 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 745ff3a9-6485-4737-8d50-c2f2d563dc3c
Nov 29 01:38:19 np0005539510 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:38:19 np0005539510 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 745ff3a9-6485-4737-8d50-c2f2d563dc3c
Nov 29 01:38:19 np0005539510 setroubleshoot[194128]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:38:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:19.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:19.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:20 np0005539510 python3.9[194552]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:20 np0005539510 systemd[1]: Reloading.
Nov 29 01:38:20 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:20 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:20 np0005539510 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 01:38:20 np0005539510 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 01:38:20 np0005539510 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 01:38:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:20 np0005539510 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 01:38:20 np0005539510 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 01:38:20 np0005539510 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 01:38:20 np0005539510 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 01:38:20 np0005539510 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 01:38:20 np0005539510 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 01:38:20 np0005539510 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 01:38:20 np0005539510 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:38:20 np0005539510 podman[194590]: 2025-11-29 06:38:20.669198575 +0000 UTC m=+0.053438713 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:38:20 np0005539510 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:38:21 np0005539510 python3.9[194787]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:38:21 np0005539510 systemd[1]: Reloading.
Nov 29 01:38:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:21.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:21 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:21 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:21.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:21 np0005539510 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 01:38:21 np0005539510 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 01:38:21 np0005539510 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 01:38:21 np0005539510 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 01:38:21 np0005539510 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 01:38:21 np0005539510 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 01:38:21 np0005539510 systemd[1]: Starting libvirt secret daemon...
Nov 29 01:38:21 np0005539510 systemd[1]: Started libvirt secret daemon.
Nov 29 01:38:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 01:38:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:23.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 01:38:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:23.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:24 np0005539510 python3.9[195002]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:25.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:25 np0005539510 python3.9[195154]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:38:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000032s ======
Nov 29 01:38:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:25.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Nov 29 01:38:26 np0005539510 python3.9[195307]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:27.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:27.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:28 np0005539510 python3.9[195461]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:38:29 np0005539510 python3.9[195633]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:29 np0005539510 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 01:38:29 np0005539510 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 01:38:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:29.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:29 np0005539510 python3.9[195783]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398308.7991729-3370-237674090291383/.source.xml follow=False _original_basename=secret.xml.j2 checksum=63744b3abb892aaab98ed7226f328ffc66ff66bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:29.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:30 np0005539510 python3.9[195936]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 336ec58c-893b-528f-a0c1-6ed1196bc047#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:31 np0005539510 python3.9[196098]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:31.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:31.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000062s ======
Nov 29 01:38:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:33.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000062s
Nov 29 01:38:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000031s ======
Nov 29 01:38:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:33.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Nov 29 01:38:34 np0005539510 python3.9[196562]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:34 np0005539510 python3.9[196715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:35 np0005539510 python3.9[196838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398314.4277127-3536-46108208255348/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:35.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:35.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:36 np0005539510 python3.9[196990]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:37 np0005539510 python3.9[197143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:37.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:37 np0005539510 python3.9[197221]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:37.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:38 np0005539510 python3.9[197373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:38 np0005539510 python3.9[197452]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.uzdfzrlg recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:39.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:39 np0005539510 python3.9[197604]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:39.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:40 np0005539510 python3.9[197682]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:41 np0005539510 python3.9[197835]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:41.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:42 np0005539510 python3[197989]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:38:43 np0005539510 python3.9[198141]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:43.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:43.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:43 np0005539510 python3.9[198219]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:44 np0005539510 python3.9[198372]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:45 np0005539510 python3.9[198450]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:45.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:45.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:46 np0005539510 python3.9[198602]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:46 np0005539510 python3.9[198680]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:47 np0005539510 python3.9[198833]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:47.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:47.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:48 np0005539510 python3.9[198911]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:48 np0005539510 podman[199036]: 2025-11-29 06:38:48.949626026 +0000 UTC m=+0.106376866 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 01:38:49 np0005539510 python3.9[199080]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:49 np0005539510 python3.9[199265]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.4041808-3910-213663633046478/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:38:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:49.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:38:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:49.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:50 np0005539510 python3.9[199417]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:50 np0005539510 podman[199494]: 2025-11-29 06:38:50.893690236 +0000 UTC m=+0.049553183 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 01:38:51 np0005539510 python3.9[199589]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:51.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:51.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:52 np0005539510 python3.9[199744]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:53 np0005539510 python3.9[199897]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:53.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:53.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:54 np0005539510 python3.9[200050]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:55 np0005539510 python3.9[200205]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:38:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:55.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:55 np0005539510 python3.9[200360]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:55.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:56 np0005539510 python3.9[200513]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:57 np0005539510 python3.9[200636]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398336.1805663-4126-162989859395076/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:57.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:57.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:38:58 np0005539510 python3.9[200788]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:59 np0005539510 python3.9[200912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398337.9224632-4170-273631381895449/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:38:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:38:59.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:38:59 np0005539510 python3.9[201064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:38:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:38:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:38:59.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:00 np0005539510 python3.9[201187]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398339.3346808-4216-36531274254630/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:01 np0005539510 python3.9[201340]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:01 np0005539510 systemd[1]: Reloading.
Nov 29 01:39:01 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:01 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:01.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:01.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:01 np0005539510 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 01:39:02 np0005539510 python3.9[201533]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:39:02 np0005539510 systemd[1]: Reloading.
Nov 29 01:39:03 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:03 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:03 np0005539510 systemd[1]: Reloading.
Nov 29 01:39:03 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:03 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:03.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:03.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:04 np0005539510 systemd[1]: session-49.scope: Deactivated successfully.
Nov 29 01:39:04 np0005539510 systemd[1]: session-49.scope: Consumed 1min 27.908s CPU time.
Nov 29 01:39:04 np0005539510 systemd-logind[784]: Session 49 logged out. Waiting for processes to exit.
Nov 29 01:39:04 np0005539510 systemd-logind[784]: Removed session 49.
Nov 29 01:39:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:05.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:05.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:07.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:07.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:09 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:09.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:09.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:10 np0005539510 systemd-logind[784]: New session 50 of user zuul.
Nov 29 01:39:10 np0005539510 systemd[1]: Started Session 50 of User zuul.
Nov 29 01:39:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:10 np0005539510 python3.9[201970]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:39:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:11 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:39:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:11.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:11.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:39:12 np0005539510 python3.9[202125]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:39:12 np0005539510 network[202142]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:39:12 np0005539510 network[202143]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:39:12 np0005539510 network[202144]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:39:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:13.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:13.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:39:15.130 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:39:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:39:15.131 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:39:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:39:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:39:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:15.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:15.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:17.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:17 np0005539510 python3.9[202418]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:39:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:17.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:39:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 2481 writes, 14K keys, 2481 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.03 MB/s
Cumulative WAL: 2481 writes, 2481 syncs, 1.00 writes per sync, written: 0.03 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1115 writes, 4810 keys, 1115 commit groups, 1.0 writes per commit group, ingest: 11.88 MB, 0.02 MB/s
Interval WAL: 1115 writes, 1115 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     23.4      0.73              0.04         5    0.146       0      0       0.0       0.0
  L6      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.2     90.1     76.3      0.50              0.10         4    0.125     17K   1774       0.0       0.0
 Sum      1/0   10.12 MB   0.0      0.0     0.0      0.0       0.1      0.0       0.0   3.2     36.6     44.9      1.23              0.14         9    0.136     17K   1774       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.2     23.9     25.0      1.01              0.06         4    0.252    9503   1038       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     90.1     76.3      0.50              0.10         4    0.125     17K   1774       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     23.4      0.73              0.04         4    0.182       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.017, interval 0.006
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.05 GB write, 0.05 MB/s write, 0.04 GB read, 0.04 MB/s read, 1.2 seconds
Interval compaction: 0.02 GB write, 0.04 MB/s write, 0.02 GB read, 0.04 MB/s read, 1.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 304.00 MB usage: 1.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 8.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(68,1.34 MB,0.441245%) FilterBlock(9,58.36 KB,0.0187472%) IndexBlock(9,128.73 KB,0.0413543%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
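When a multi-line dump like the RocksDB stats block above travels through syslog/journald, embedded control characters are escaped as `#NNN` octal sequences (`#012` for newline, `#011` for tab), flattening the whole report onto one record. A minimal sketch for restoring the original layout when post-processing exported logs:

```python
import re

def unescape_syslog(line: str) -> str:
    """Replace #NNN octal escapes (e.g. #012 = newline, #011 = tab)
    with the character they encode."""
    return re.sub(r"#([0-7]{3})", lambda m: chr(int(m.group(1), 8)), line)

# Example: a flattened fragment of the stats dump regains its line breaks.
stats = "** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval"
print(unescape_syslog(stats))
```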
Nov 29 01:39:18 np0005539510 python3.9[202503]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:39:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:19.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:19 np0005539510 podman[202505]: 2025-11-29 06:39:19.92931298 +0000 UTC m=+0.084818711 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 01:39:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:19.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
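The radosgw `beast` frontend logs one Apache-style access line per request; the `HEAD /` probes arriving every ~2 s from 192.168.122.100 and .102 look like periodic health checks. A hypothetical parser for the fields visible in these lines (connection handle, client IP, user, timestamp, request line, status, body bytes, latency):

```python
import re

# Field layout taken from the beast access lines in this log; the regex
# names are this sketch's own, not radosgw's.
BEAST_RE = re.compile(
    r'beast: (?P<handle>0x[0-9a-f]+): (?P<client>\S+) - (?P<user>\S+) '
    r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
    r'.*latency=(?P<latency>[\d.]+)s'
)

line = ('beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous '
        '[29/Nov/2025:06:39:19.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.001000026s')
m = BEAST_RE.search(line)
print(m.group('client'), m.group('status'), float(m.group('latency')))
```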
Nov 29 01:39:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
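The monitor's `_set_new_cache_sizes` figures are raw byte counts. Converting them to MiB shows that `kv_alloc` is exactly 304 MiB, matching the `BinnedLRUCache` capacity ("capacity: 304.00 MB") reported in the RocksDB stats dump earlier in this log:

```python
# Byte counts copied from the _set_new_cache_sizes line above.
MiB = 2 ** 20
sizes = {
    "cache_size": 1020054731,
    "inc_alloc":  348127232,
    "full_alloc": 348127232,
    "kv_alloc":   318767104,
}
for name, b in sizes.items():
    print(f"{name}: {b / MiB:.2f} MiB")
# kv_alloc / MiB == 304.00, the RocksDB block-cache capacity.
```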
Nov 29 01:39:21 np0005539510 podman[202557]: 2025-11-29 06:39:21.474833414 +0000 UTC m=+0.056449605 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:39:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:21.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:21.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:39:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:23.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:23.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:25 np0005539510 python3.9[202755]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:25.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:25.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:26 np0005539510 python3.9[202907]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:26 np0005539510 python3.9[203061]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:27 np0005539510 python3.9[203213]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:27.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:27.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:28 np0005539510 python3.9[203368]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.069780) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369069859, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1500, "num_deletes": 250, "total_data_size": 3710953, "memory_usage": 3747472, "flush_reason": "Manual Compaction"}
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369082086, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1461897, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13079, "largest_seqno": 14574, "table_properties": {"data_size": 1457009, "index_size": 2284, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12208, "raw_average_key_size": 20, "raw_value_size": 1446470, "raw_average_value_size": 2402, "num_data_blocks": 103, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398218, "oldest_key_time": 1764398218, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
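RocksDB's `EVENT_LOG_v1` records are machine-readable: everything after the fixed `EVENT_LOG_v1 ` prefix is a JSON object (nested, in the case of `table_file_creation` with its `table_properties`). A small sketch for pulling the payload out of a journal line:

```python
import json

def parse_event_log(line: str) -> dict:
    """Return the JSON object following 'EVENT_LOG_v1 ' in a log line."""
    _, _, payload = line.partition("EVENT_LOG_v1 ")
    return json.loads(payload)

# Abridged version of the flush_started record above.
line = ('rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369069859, "job": 11, '
        '"event": "flush_started", "num_entries": 1500, "num_deletes": 250}')
evt = parse_event_log(line)
print(evt["event"], evt["job"])
```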
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 12350 microseconds, and 5477 cpu microseconds.
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.082137) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1461897 bytes OK
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.082160) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.084550) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.084575) EVENT_LOG_v1 {"time_micros": 1764398369084567, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.084599) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3704043, prev total WAL file size 3704043, number of live WAL files 2.
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.085936) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323533' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1427KB)], [24(10MB)]
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369086028, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12071632, "oldest_snapshot_seqno": -1}
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4613 keys, 9170191 bytes, temperature: kUnknown
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369166938, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 9170191, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9137089, "index_size": 20448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11589, "raw_key_size": 112551, "raw_average_key_size": 24, "raw_value_size": 9051483, "raw_average_value_size": 1962, "num_data_blocks": 883, "num_entries": 4613, "num_filter_entries": 4613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398369, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.167275) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 9170191 bytes
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.169163) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.0 rd, 113.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.1 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(14.5) write-amplify(6.3) OK, records in: 5066, records dropped: 453 output_compression: NoCompression
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.169200) EVENT_LOG_v1 {"time_micros": 1764398369169183, "job": 12, "event": "compaction_finished", "compaction_time_micros": 81011, "compaction_time_cpu_micros": 38004, "output_level": 6, "num_output_files": 1, "total_output_size": 9170191, "num_input_records": 5066, "num_output_records": 4613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
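The `write-amplify(6.3)` and `read-write-amplify(14.5)` figures in the JOB 12 summary can be cross-checked from the byte counts the surrounding records report: 1461897 bytes of fresh L0 input (table #26), 12071632 bytes of total compaction input (`input_data_size`), and 9170191 bytes of output (table #27):

```python
# Byte counts from the JOB 11/12 event-log records above.
l0_input    = 1461897    # table #26, flushed from the memtable
total_input = 12071632   # compaction_started input_data_size (1@0 + 1@6)
output      = 9170191    # table #27, the compacted L6 file

write_amplify = output / l0_input                 # bytes written per new byte
rw_amplify = (total_input + output) / l0_input    # bytes read+written per new byte
print(round(write_amplify, 1), round(rw_amplify, 1))  # matches 6.3 and 14.5
```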
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369169900, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398369173597, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.085802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:39:29.173782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:39:29 np0005539510 python3.9[203492]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.9435863-253-236948993648395/.source.iscsi _original_basename=.792wyc72 follow=False checksum=e97d79e3c2fa6d72bfed153b7b0babee6d736c42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:29.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:29.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:30 np0005539510 python3.9[203694]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:31 np0005539510 python3.9[203847]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
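The `lineinfile` invocation above ensures `node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5` in `/etc/iscsi/iscsid.conf`: if a line matching `regexp` exists it is rewritten, otherwise the new line is inserted after the commented `insertafter` anchor. A simplified sketch of that logic (the real module handles only the last regexp match, plus `create`/`backup` and other options):

```python
import re

LINE  = "node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5"
MATCH = re.compile(r"^node\.session\.auth\.chap_algs")
AFTER = re.compile(r"^#node\.session\.auth\.chap\.algs")

def ensure_line(lines):
    if any(MATCH.match(ln) for ln in lines):
        # regexp matched: rewrite the matching line(s) in place
        return [LINE if MATCH.match(ln) else ln for ln in lines]
    out, placed = [], False
    for ln in lines:
        out.append(ln)
        if not placed and AFTER.match(ln):
            out.append(LINE)          # insert after the commented anchor
            placed = True
    if not placed:
        out.append(LINE)              # no anchor either: append at EOF
    return out

print(ensure_line(["#node.session.auth.chap.algs = MD5", "iscsid.startup = ..."]))
```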
Nov 29 01:39:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:31.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:32.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:32 np0005539510 python3.9[203999]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:32 np0005539510 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 01:39:33 np0005539510 python3.9[204156]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:33 np0005539510 systemd[1]: Reloading.
Nov 29 01:39:33 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:33 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:33.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:33 np0005539510 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:39:33 np0005539510 systemd[1]: Starting Open-iSCSI...
Nov 29 01:39:33 np0005539510 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 01:39:33 np0005539510 systemd[1]: Started Open-iSCSI.
Nov 29 01:39:33 np0005539510 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 01:39:33 np0005539510 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 01:39:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:34.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:35 np0005539510 python3.9[204358]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:39:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:35 np0005539510 network[204375]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:39:35 np0005539510 network[204376]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:39:35 np0005539510 network[204377]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:39:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:36.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:37.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:38.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:39.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:40.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:40 np0005539510 python3.9[204652]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:39:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:41.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:42.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:43.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:44.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:45.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:45 np0005539510 python3.9[204806]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 01:39:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:46.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:47 np0005539510 python3.9[204963]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:47.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:47 np0005539510 python3.9[205086]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398386.488407-483-8029253838537/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:48.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:39:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] 
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 5796 writes, 24K keys, 5796 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5796 writes, 923 syncs, 6.28 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 443 writes, 694 keys, 443 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
Interval WAL: 443 writes, 211 syncs, 2.10 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Nov 29 01:39:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:50 np0005539510 python3.9[205277]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:50.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:50 np0005539510 podman[205367]: 2025-11-29 06:39:50.960666899 +0000 UTC m=+0.104640272 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:39:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:39:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:51.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:39:51 np0005539510 podman[205397]: 2025-11-29 06:39:51.899788693 +0000 UTC m=+0.061899538 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:39:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:52.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:54.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:39:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:55.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:39:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:56.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:39:57 np0005539510 python3.9[205494]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:57 np0005539510 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:39:57 np0005539510 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:39:57 np0005539510 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:39:57 np0005539510 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:39:57 np0005539510 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:39:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:57.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:39:58.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:39:59 np0005539510 python3.9[205651]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:39:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:39:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:39:59.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:00.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:00 np0005539510 python3.9[205803]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:01 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:40:01 np0005539510 python3.9[205956]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:01.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:02.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:02 np0005539510 python3.9[206109]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:03 np0005539510 python3.9[206232]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398401.5194995-658-63452576492630/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:04.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:04 np0005539510 python3.9[206384]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:05 np0005539510 python3.9[206538]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:06.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:06 np0005539510 python3.9[206691]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:07 np0005539510 python3.9[206843]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:40:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:07.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:40:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:08.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:08 np0005539510 python3.9[206995]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:09 np0005539510 python3.9[207148]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:09.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:10 np0005539510 python3.9[207300]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:10.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:10 np0005539510 python3.9[207503]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:11.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:12 np0005539510 python3.9[207655]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:12 np0005539510 python3.9[207810]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:13.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:14.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:14 np0005539510 python3.9[207962]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:14 np0005539510 python3.9[208115]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:40:15.131 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:40:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:40:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:40:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:40:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:40:15 np0005539510 python3.9[208193]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:15.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:15 np0005539510 python3.9[208345]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:16.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:16 np0005539510 python3.9[208423]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:17 np0005539510 python3.9[208576]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:17.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:18.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:18 np0005539510 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 01:40:18 np0005539510 python3.9[208729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:18 np0005539510 python3.9[208808]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:19 np0005539510 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:40:19 np0005539510 python3.9[208961]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:40:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:19.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:40:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:20.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:20 np0005539510 python3.9[209039]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:21 np0005539510 python3.9[209192]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:21 np0005539510 systemd[1]: Reloading.
Nov 29 01:40:21 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:21 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:21 np0005539510 podman[209194]: 2025-11-29 06:40:21.313770729 +0000 UTC m=+0.082746184 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:40:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:21.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:22.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:22 np0005539510 podman[209395]: 2025-11-29 06:40:22.066746235 +0000 UTC m=+0.081157151 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 01:40:22 np0005539510 python3.9[209594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:23 np0005539510 python3.9[209745]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 01:40:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 01:40:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:40:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:23.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:24.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:24 np0005539510 python3.9[209909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.342195) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424342236, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 734, "num_deletes": 252, "total_data_size": 1398800, "memory_usage": 1425104, "flush_reason": "Manual Compaction"}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424348967, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 924399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14579, "largest_seqno": 15308, "table_properties": {"data_size": 920862, "index_size": 1381, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 6914, "raw_average_key_size": 16, "raw_value_size": 913841, "raw_average_value_size": 2191, "num_data_blocks": 63, "num_entries": 417, "num_filter_entries": 417, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398369, "oldest_key_time": 1764398369, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 6825 microseconds, and 2849 cpu microseconds.
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.349017) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 924399 bytes OK
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.349039) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350426) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350441) EVENT_LOG_v1 {"time_micros": 1764398424350437, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350456) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1394898, prev total WAL file size 1394898, number of live WAL files 2.
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.351044) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323533' seq:0, type:0; will stop at (end)
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(902KB)], [27(8955KB)]
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424351105, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 10094590, "oldest_snapshot_seqno": -1}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4513 keys, 9525612 bytes, temperature: kUnknown
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424422897, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 9525612, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9492743, "index_size": 20471, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11333, "raw_key_size": 112202, "raw_average_key_size": 24, "raw_value_size": 9408392, "raw_average_value_size": 2084, "num_data_blocks": 864, "num_entries": 4513, "num_filter_entries": 4513, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.423327) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 9525612 bytes
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.424565) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.0 rd, 132.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 8.7 +0.0 blob) out(9.1 +0.0 blob), read-write-amplify(21.2) write-amplify(10.3) OK, records in: 5030, records dropped: 517 output_compression: NoCompression
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.424582) EVENT_LOG_v1 {"time_micros": 1764398424424574, "job": 14, "event": "compaction_finished", "compaction_time_micros": 72083, "compaction_time_cpu_micros": 20512, "output_level": 6, "num_output_files": 1, "total_output_size": 9525612, "num_input_records": 5030, "num_output_records": 4513, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424424795, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398424426355, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.350938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426404) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:40:24.426409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:40:24 np0005539510 python3.9[209988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:40:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:25.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:25 np0005539510 python3.9[210140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:25 np0005539510 systemd[1]: Reloading.
Nov 29 01:40:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:26.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:26 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:26 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:26 np0005539510 systemd[1]: Starting Create netns directory...
Nov 29 01:40:26 np0005539510 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:40:26 np0005539510 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:40:26 np0005539510 systemd[1]: Finished Create netns directory.
Nov 29 01:40:27 np0005539510 python3.9[210334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:27.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:28.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:28 np0005539510 python3.9[210486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:29 np0005539510 python3.9[210610]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398428.054028-1279-122043633559972/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:29.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:30.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:30 np0005539510 python3.9[210813]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:31 np0005539510 python3.9[210965]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:31.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:32.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:32 np0005539510 python3.9[211088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398431.1757958-1353-233332167031998/.source.json _original_basename=.qy6x84y7 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:33 np0005539510 python3.9[211241]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:33 np0005539510 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 01:40:33 np0005539510 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 01:40:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:33.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:34.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:35.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:36.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:37 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:37 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:40:37 np0005539510 python3.9[211722]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 01:40:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:37.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:38.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:38 np0005539510 python3.9[211874]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:40:39 np0005539510 python3.9[212027]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:40:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:40.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:42 np0005539510 python3[212207]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:40:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:42.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:43 np0005539510 podman[212220]: 2025-11-29 06:40:43.243714791 +0000 UTC m=+1.154205205 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:40:43 np0005539510 podman[212278]: 2025-11-29 06:40:43.366130039 +0000 UTC m=+0.043803887 container create d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:40:43 np0005539510 podman[212278]: 2025-11-29 06:40:43.341922429 +0000 UTC m=+0.019596307 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:40:43 np0005539510 python3[212207]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:40:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:43.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:44.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:45.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:46.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:46 np0005539510 python3.9[212470]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:47.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:47 np0005539510 python3.9[212624]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:48.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:48 np0005539510 python3.9[212700]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:49 np0005539510 python3.9[212852]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398448.3843925-1617-262441756147483/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:49 np0005539510 python3.9[212928]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:40:49 np0005539510 systemd[1]: Reloading.
Nov 29 01:40:49 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:49 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:49.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:50.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:50 np0005539510 python3.9[213038]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:51 np0005539510 systemd[1]: Reloading.
Nov 29 01:40:51 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:51 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:51 np0005539510 podman[213092]: 2025-11-29 06:40:51.762633116 +0000 UTC m=+0.162369952 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:40:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:51.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:51 np0005539510 systemd[1]: Starting multipathd container...
Nov 29 01:40:52 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:40:52 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:52 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:52.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:52 np0005539510 systemd[1]: Started /usr/bin/podman healthcheck run d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.
Nov 29 01:40:52 np0005539510 podman[213156]: 2025-11-29 06:40:52.103205934 +0000 UTC m=+0.113437408 container init d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 01:40:52 np0005539510 multipathd[213172]: + sudo -E kolla_set_configs
Nov 29 01:40:52 np0005539510 podman[213156]: 2025-11-29 06:40:52.138159363 +0000 UTC m=+0.148390747 container start d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:40:52 np0005539510 podman[213156]: multipathd
Nov 29 01:40:52 np0005539510 systemd[1]: Started multipathd container.
Nov 29 01:40:52 np0005539510 podman[213175]: 2025-11-29 06:40:52.15963911 +0000 UTC m=+0.064132634 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:40:52 np0005539510 multipathd[213172]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:40:52 np0005539510 multipathd[213172]: INFO:__main__:Validating config file
Nov 29 01:40:52 np0005539510 multipathd[213172]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:40:52 np0005539510 multipathd[213172]: INFO:__main__:Writing out command to execute
Nov 29 01:40:52 np0005539510 multipathd[213172]: ++ cat /run_command
Nov 29 01:40:52 np0005539510 multipathd[213172]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:40:52 np0005539510 multipathd[213172]: + ARGS=
Nov 29 01:40:52 np0005539510 multipathd[213172]: + sudo kolla_copy_cacerts
Nov 29 01:40:52 np0005539510 podman[213192]: 2025-11-29 06:40:52.20765212 +0000 UTC m=+0.058142813 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:40:52 np0005539510 multipathd[213172]: + [[ ! -n '' ]]
Nov 29 01:40:52 np0005539510 multipathd[213172]: + . kolla_extend_start
Nov 29 01:40:52 np0005539510 multipathd[213172]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:40:52 np0005539510 multipathd[213172]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:40:52 np0005539510 multipathd[213172]: + umask 0022
Nov 29 01:40:52 np0005539510 multipathd[213172]: + exec /usr/sbin/multipathd -d
Nov 29 01:40:52 np0005539510 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-653d29d05bd51575.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:40:52 np0005539510 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-653d29d05bd51575.service: Failed with result 'exit-code'.
Nov 29 01:40:52 np0005539510 multipathd[213172]: 3920.880617 | --------start up--------
Nov 29 01:40:52 np0005539510 multipathd[213172]: 3920.880638 | read /etc/multipath.conf
Nov 29 01:40:52 np0005539510 multipathd[213172]: 3920.885709 | path checkers start up
Nov 29 01:40:52 np0005539510 python3.9[213379]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:40:53 np0005539510 python3.9[213533]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:53.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:54.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:54 np0005539510 python3.9[213699]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:40:54 np0005539510 systemd[1]: Stopping multipathd container...
Nov 29 01:40:54 np0005539510 multipathd[213172]: 3923.480283 | exit (signal)
Nov 29 01:40:54 np0005539510 multipathd[213172]: 3923.480976 | --------shut down-------
Nov 29 01:40:54 np0005539510 systemd[1]: libpod-d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.scope: Deactivated successfully.
Nov 29 01:40:54 np0005539510 conmon[213172]: conmon d45765539066b12c036d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.scope/container/memory.events
Nov 29 01:40:54 np0005539510 podman[213703]: 2025-11-29 06:40:54.855345293 +0000 UTC m=+0.066770675 container died d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:40:54 np0005539510 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-653d29d05bd51575.timer: Deactivated successfully.
Nov 29 01:40:54 np0005539510 systemd[1]: Stopped /usr/bin/podman healthcheck run d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.
Nov 29 01:40:54 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:40:54 np0005539510 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-userdata-shm.mount: Deactivated successfully.
Nov 29 01:40:54 np0005539510 systemd[1]: var-lib-containers-storage-overlay-ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea-merged.mount: Deactivated successfully.
Nov 29 01:40:55 np0005539510 podman[213703]: 2025-11-29 06:40:55.304625461 +0000 UTC m=+0.516050883 container cleanup d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 01:40:55 np0005539510 podman[213703]: multipathd
Nov 29 01:40:55 np0005539510 podman[213731]: multipathd
Nov 29 01:40:55 np0005539510 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 01:40:55 np0005539510 systemd[1]: Stopped multipathd container.
Nov 29 01:40:55 np0005539510 systemd[1]: Starting multipathd container...
Nov 29 01:40:55 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:40:55 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:55 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef36de37cfcbb612982f0f9bc5265315d5497549432bf65b48c9ef13850f00ea/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:40:55 np0005539510 systemd[1]: Started /usr/bin/podman healthcheck run d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0.
Nov 29 01:40:55 np0005539510 podman[213744]: 2025-11-29 06:40:55.504448918 +0000 UTC m=+0.098471956 container init d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:40:55 np0005539510 multipathd[213759]: + sudo -E kolla_set_configs
Nov 29 01:40:55 np0005539510 podman[213744]: 2025-11-29 06:40:55.537300381 +0000 UTC m=+0.131323389 container start d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS)
Nov 29 01:40:55 np0005539510 podman[213744]: multipathd
Nov 29 01:40:55 np0005539510 systemd[1]: Started multipathd container.
Nov 29 01:40:55 np0005539510 multipathd[213759]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:40:55 np0005539510 multipathd[213759]: INFO:__main__:Validating config file
Nov 29 01:40:55 np0005539510 multipathd[213759]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:40:55 np0005539510 multipathd[213759]: INFO:__main__:Writing out command to execute
Nov 29 01:40:55 np0005539510 multipathd[213759]: ++ cat /run_command
Nov 29 01:40:55 np0005539510 multipathd[213759]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:40:55 np0005539510 multipathd[213759]: + ARGS=
Nov 29 01:40:55 np0005539510 multipathd[213759]: + sudo kolla_copy_cacerts
Nov 29 01:40:55 np0005539510 multipathd[213759]: + [[ ! -n '' ]]
Nov 29 01:40:55 np0005539510 multipathd[213759]: + . kolla_extend_start
Nov 29 01:40:55 np0005539510 multipathd[213759]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:40:55 np0005539510 multipathd[213759]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:40:55 np0005539510 multipathd[213759]: + umask 0022
Nov 29 01:40:55 np0005539510 multipathd[213759]: + exec /usr/sbin/multipathd -d
Nov 29 01:40:55 np0005539510 multipathd[213759]: 3924.311146 | --------start up--------
Nov 29 01:40:55 np0005539510 multipathd[213759]: 3924.311194 | read /etc/multipath.conf
Nov 29 01:40:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:40:55 np0005539510 podman[213766]: 2025-11-29 06:40:55.655789244 +0000 UTC m=+0.109426891 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:40:55 np0005539510 multipathd[213759]: 3924.316137 | path checkers start up
Nov 29 01:40:55 np0005539510 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-62a33ed563dbe54a.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:40:55 np0005539510 systemd[1]: d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0-62a33ed563dbe54a.service: Failed with result 'exit-code'.
Nov 29 01:40:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:55.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:40:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:56.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:40:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:57.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:40:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:40:58 np0005539510 python3.9[213951]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:40:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:40:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:40:59.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:00 np0005539510 python3.9[214103]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:41:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:00.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:00 np0005539510 python3.9[214256]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 01:41:00 np0005539510 kernel: Key type psk registered
Nov 29 01:41:01 np0005539510 python3.9[214419]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:01.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:02.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:02 np0005539510 python3.9[214542]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398461.2045333-1857-149618396532093/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:03 np0005539510 python3.9[214695]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:03.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:04.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:04 np0005539510 python3.9[214847]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:41:04 np0005539510 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:41:04 np0005539510 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:41:04 np0005539510 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:41:04 np0005539510 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:41:04 np0005539510 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:41:05 np0005539510 python3.9[215004]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:41:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:05.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:06.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:07.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:08.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:09 np0005539510 systemd[1]: Reloading.
Nov 29 01:41:09 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:09 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:09 np0005539510 systemd[1]: Reloading.
Nov 29 01:41:09 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:09 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:10.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:10 np0005539510 systemd-logind[784]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 01:41:10 np0005539510 systemd-logind[784]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 01:41:10 np0005539510 lvm[215119]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:41:10 np0005539510 lvm[215119]: VG ceph_vg0 finished
Nov 29 01:41:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:10 np0005539510 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:41:10 np0005539510 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:41:10 np0005539510 systemd[1]: Reloading.
Nov 29 01:41:10 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:10 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:11 np0005539510 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:41:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:11.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:12.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:13 np0005539510 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:41:13 np0005539510 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:41:13 np0005539510 systemd[1]: man-db-cache-update.service: Consumed 1.738s CPU time.
Nov 29 01:41:13 np0005539510 systemd[1]: run-re34cc9af837248f7a074ad6d47e83fba.service: Deactivated successfully.
Nov 29 01:41:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:13.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:13 np0005539510 python3.9[216513]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:41:13 np0005539510 systemd[1]: Stopping Open-iSCSI...
Nov 29 01:41:13 np0005539510 iscsid[204197]: iscsid shutting down.
Nov 29 01:41:13 np0005539510 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 01:41:13 np0005539510 systemd[1]: Stopped Open-iSCSI.
Nov 29 01:41:13 np0005539510 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:41:13 np0005539510 systemd[1]: Starting Open-iSCSI...
Nov 29 01:41:13 np0005539510 systemd[1]: Started Open-iSCSI.
Nov 29 01:41:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:14.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:15 np0005539510 python3.9[216669]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:41:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:41:15.132 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:41:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:41:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:41:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:41:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:41:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:15.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:16 np0005539510 python3.9[216825]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:16.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:17 np0005539510 python3.9[216978]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:17 np0005539510 systemd[1]: Reloading.
Nov 29 01:41:17 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:17 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:17.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:18.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:18 np0005539510 python3.9[217162]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:41:18 np0005539510 network[217180]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:41:18 np0005539510 network[217181]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:41:18 np0005539510 network[217182]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:41:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:19.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:20.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:41:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:21.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:41:22 np0005539510 podman[217269]: 2025-11-29 06:41:22.093336095 +0000 UTC m=+0.094475376 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 01:41:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:22.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:22 np0005539510 podman[217306]: 2025-11-29 06:41:22.254966832 +0000 UTC m=+0.048799851 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:41:23 np0005539510 python3.9[217504]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:23.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:24.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:24 np0005539510 python3.9[217657]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:25 np0005539510 python3.9[217811]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:25 np0005539510 podman[217965]: 2025-11-29 06:41:25.889760813 +0000 UTC m=+0.059776265 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:41:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:25 np0005539510 python3.9[217964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:26.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:26 np0005539510 python3.9[218139]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:27 np0005539510 python3.9[218292]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:27.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:28 np0005539510 python3.9[218445]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:28 np0005539510 python3.9[218599]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:29.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:30.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:30 np0005539510 python3.9[218753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:31 np0005539510 python3.9[218955]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:31.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:32 np0005539510 python3.9[219107]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:32.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:32 np0005539510 python3.9[219260]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:33 np0005539510 python3.9[219412]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:33 np0005539510 python3.9[219564]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:41:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:33.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:41:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:34.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:34 np0005539510 python3.9[219716]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:35 np0005539510 python3.9[219869]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:35.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:36.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:37 np0005539510 podman[220143]: 2025-11-29 06:41:37.10774482 +0000 UTC m=+0.059751704 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_REF=reef)
Nov 29 01:41:37 np0005539510 podman[220143]: 2025-11-29 06:41:37.220358652 +0000 UTC m=+0.172365546 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:41:37 np0005539510 python3.9[220215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:37 np0005539510 podman[220500]: 2025-11-29 06:41:37.816370754 +0000 UTC m=+0.058091019 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:41:37 np0005539510 podman[220500]: 2025-11-29 06:41:37.826350432 +0000 UTC m=+0.068070707 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:41:37 np0005539510 python3.9[220513]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:37.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:38 np0005539510 podman[220566]: 2025-11-29 06:41:38.018485358 +0000 UTC m=+0.042828081 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, release=1793, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, name=keepalived, description=keepalived for Ceph, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.buildah.version=1.28.2, architecture=x86_64)
Nov 29 01:41:38 np0005539510 podman[220566]: 2025-11-29 06:41:38.032340219 +0000 UTC m=+0.056682942 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, description=keepalived for Ceph, io.buildah.version=1.28.2, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, release=1793, vcs-type=git, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Nov 29 01:41:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:38.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:38 np0005539510 python3.9[220824]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:39 np0005539510 python3.9[221032]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:41:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:41:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:41:39 np0005539510 python3.9[221184]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:40.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:40 np0005539510 python3.9[221338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:41 np0005539510 python3.9[221491]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:41 np0005539510 python3.9[221643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:42.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:43 np0005539510 python3.9[221796]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:44.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:44 np0005539510 python3.9[221948]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:41:45 np0005539510 python3.9[222101]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:45 np0005539510 systemd[1]: Reloading.
Nov 29 01:41:45 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:45 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:45.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:46.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:46 np0005539510 python3.9[222288]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:47 np0005539510 python3.9[222491]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:41:47 np0005539510 python3.9[222644]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:47.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:48.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:48 np0005539510 python3.9[222797]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:49 np0005539510 python3.9[222951]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:49.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:50 np0005539510 python3.9[223104]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:50.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:50 np0005539510 python3.9[223257]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:51 np0005539510 python3.9[223461]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:51.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:52.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:52 np0005539510 podman[223489]: 2025-11-29 06:41:52.911207566 +0000 UTC m=+0.075625920 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:41:52 np0005539510 podman[223488]: 2025-11-29 06:41:52.96464473 +0000 UTC m=+0.128932651 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 01:41:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:41:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:53.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:41:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:54.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:54 np0005539510 python3.9[223659]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:55 np0005539510 python3.9[223811]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:55 np0005539510 python3.9[223963]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:41:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:41:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:56.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:56 np0005539510 podman[224088]: 2025-11-29 06:41:56.74361703 +0000 UTC m=+0.056373673 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 01:41:56 np0005539510 python3.9[224136]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:57 np0005539510 python3.9[224288]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:58 np0005539510 python3.9[224440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:41:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:41:58.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:41:58 np0005539510 python3.9[224593]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:59 np0005539510 python3.9[224745]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:41:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:41:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:41:59.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:00 np0005539510 python3.9[224897]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:00.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:00 np0005539510 python3.9[225050]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:01.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:02.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:03.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:42:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:04.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:42:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:05.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:06.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:08.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:08.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:10.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:10.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:12.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:12.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:14.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:14.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:15 np0005539510 python3.9[225259]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 01:42:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:42:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:42:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:42:15.133 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:15 np0005539510 python3.9[225412]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:42:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:16 np0005539510 python3.9[225571]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:42:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:18.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:18.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:18 np0005539510 systemd-logind[784]: New session 51 of user zuul.
Nov 29 01:42:18 np0005539510 systemd[1]: Started Session 51 of User zuul.
Nov 29 01:42:18 np0005539510 systemd[1]: session-51.scope: Deactivated successfully.
Nov 29 01:42:18 np0005539510 systemd-logind[784]: Session 51 logged out. Waiting for processes to exit.
Nov 29 01:42:18 np0005539510 systemd-logind[784]: Removed session 51.
Nov 29 01:42:19 np0005539510 python3.9[225758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:20.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:20 np0005539510 python3.9[225879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.0540826-3441-30724127400186/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:20.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:20 np0005539510 python3.9[226030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:21 np0005539510 python3.9[226106]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:21 np0005539510 python3.9[226256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:22.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:22.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:22 np0005539510 python3.9[226377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398541.3067589-3441-233299418078229/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:22 np0005539510 python3.9[226528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:23 np0005539510 podman[226624]: 2025-11-29 06:42:23.300745564 +0000 UTC m=+0.053415256 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:42:23 np0005539510 podman[226623]: 2025-11-29 06:42:23.330241898 +0000 UTC m=+0.082936670 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 01:42:23 np0005539510 python3.9[226678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.3932009-3441-151088684425609/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:24 np0005539510 python3.9[226843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:24.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:24 np0005539510 python3.9[226964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398543.5901532-3441-170291300440708/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:25 np0005539510 python3.9[227115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:25 np0005539510 python3.9[227236]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398544.664049-3441-201786990003353/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:26.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:26.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:26 np0005539510 podman[227262]: 2025-11-29 06:42:26.88489776 +0000 UTC m=+0.052026561 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 01:42:27 np0005539510 python3.9[227409]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:28.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:28.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:28 np0005539510 python3.9[227561]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:29 np0005539510 python3.9[227714]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:30 np0005539510 python3.9[227866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:30.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:30 np0005539510 python3.9[227989]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398549.515259-3764-30597739619614/.source _original_basename=.wx6hmnjk follow=False checksum=80d66b7884a0d69a26deb9106b95d887b7961548 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 01:42:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:31 np0005539510 python3.9[228192]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:32.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:32.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:32 np0005539510 python3.9[228344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:32 np0005539510 python3.9[228466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398551.8989372-3840-165265584278883/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:33 np0005539510 python3.9[228616]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:34.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:34 np0005539510 python3.9[228737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.2663946-3885-278237524724234/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:34.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:35 np0005539510 python3.9[228890]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 01:42:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:36 np0005539510 python3.9[229042]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:37 np0005539510 python3[229195]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:38.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:40.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:40.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:42.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:42.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:44.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:44.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:46.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:46.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:48.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:48.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:49 np0005539510 podman[229208]: 2025-11-29 06:42:49.743085918 +0000 UTC m=+12.260450199 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:42:49 np0005539510 podman[229422]: 2025-11-29 06:42:49.858985809 +0000 UTC m=+0.026630821 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:42:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:50.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:50.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:50 np0005539510 podman[229422]: 2025-11-29 06:42:50.466998886 +0000 UTC m=+0.634643848 container create d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=nova_compute_init, tcib_managed=true)
Nov 29 01:42:50 np0005539510 python3[229195]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 01:42:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:51 np0005539510 python3.9[229613]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:52.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:52.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:52 np0005539510 python3.9[229818]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 01:42:53 np0005539510 podman[229943]: 2025-11-29 06:42:53.858060189 +0000 UTC m=+0.053782585 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:53 np0005539510 podman[229942]: 2025-11-29 06:42:53.884204437 +0000 UTC m=+0.082558370 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:54 np0005539510 python3.9[230007]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:54.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:42:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:54.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:42:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:42:54 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:42:55 np0005539510 python3[230168]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:55 np0005539510 podman[230202]: 2025-11-29 06:42:55.342889601 +0000 UTC m=+0.056968587 container create e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 01:42:55 np0005539510 podman[230202]: 2025-11-29 06:42:55.308537593 +0000 UTC m=+0.022616579 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:42:55 np0005539510 python3[230168]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 01:42:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:42:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:42:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:56.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:42:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:56.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:56 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:42:57 np0005539510 podman[230365]: 2025-11-29 06:42:57.211762765 +0000 UTC m=+0.061515863 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:42:57 np0005539510 python3.9[230413]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:42:58.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:58 np0005539510 python3.9[230567]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:42:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:42:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:42:58.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:42:58 np0005539510 python3.9[230719]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398578.3112159-4160-270685850365797/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:59 np0005539510 python3.9[230795]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:42:59 np0005539510 systemd[1]: Reloading.
Nov 29 01:42:59 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:59 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:00.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:00.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:00 np0005539510 python3.9[230905]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:00 np0005539510 systemd[1]: Reloading.
Nov 29 01:43:00 np0005539510 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:00 np0005539510 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:00 np0005539510 systemd[1]: Starting nova_compute container...
Nov 29 01:43:00 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:43:00 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:00 np0005539510 podman[230946]: 2025-11-29 06:43:00.911966976 +0000 UTC m=+0.101446393 container init e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:43:00 np0005539510 podman[230946]: 2025-11-29 06:43:00.924469445 +0000 UTC m=+0.113948802 container start e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:43:00 np0005539510 podman[230946]: nova_compute
Nov 29 01:43:00 np0005539510 nova_compute[230961]: + sudo -E kolla_set_configs
Nov 29 01:43:00 np0005539510 systemd[1]: Started nova_compute container.
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Validating config file
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying service configuration files
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Writing out command to execute
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:01 np0005539510 nova_compute[230961]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:01 np0005539510 nova_compute[230961]: ++ cat /run_command
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + CMD=nova-compute
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + ARGS=
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + sudo kolla_copy_cacerts
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + [[ ! -n '' ]]
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + . kolla_extend_start
Nov 29 01:43:01 np0005539510 nova_compute[230961]: Running command: 'nova-compute'
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + umask 0022
Nov 29 01:43:01 np0005539510 nova_compute[230961]: + exec nova-compute
Nov 29 01:43:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:43:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:02.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:43:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000025s ======
Nov 29 01:43:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:02.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Nov 29 01:43:02 np0005539510 python3.9[231124]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.108 230965 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.109 230965 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.109 230965 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.109 230965 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.240 230965 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.263 230965 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:03 np0005539510 nova_compute[230961]: 2025-11-29 06:43:03.264 230965 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:43:03 np0005539510 python3.9[231328]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:43:03 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:43:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:04.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.218 230965 INFO nova.virt.driver [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:43:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:04.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.342 230965 INFO nova.compute.provider_config [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.375 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.376 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.376 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.376 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.377 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.378 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.379 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.380 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.381 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.382 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.383 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.383 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.383 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.384 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.385 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.386 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.387 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.388 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.389 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.390 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.391 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.392 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.393 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.394 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.395 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.395 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.395 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.396 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.397 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.398 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.399 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.400 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.401 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.401 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.401 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.402 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.402 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.402 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.403 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.404 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.404 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.404 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.405 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.406 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.406 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.406 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.407 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.407 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.407 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.408 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.408 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.408 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.409 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.409 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.409 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.410 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.411 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.412 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.412 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.412 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.413 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.413 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.413 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.414 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.415 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.416 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.416 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.416 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.417 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.417 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.417 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.418 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.419 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.419 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.419 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.420 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.420 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.420 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.421 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.421 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.421 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.422 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.422 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.422 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.423 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.424 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.425 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.426 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.427 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.428 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.429 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.430 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.431 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.432 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.433 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.434 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.435 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.436 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.437 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.438 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.439 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.440 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.441 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.442 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.443 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.444 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.445 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.446 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.447 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.448 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.449 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.450 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.451 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.452 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.453 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.454 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.455 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.456 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.457 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.458 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.459 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.460 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.461 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.462 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.463 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.464 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.465 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.466 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.467 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.468 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.469 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.470 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.471 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.472 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.473 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.474 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.475 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.476 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.477 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.478 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.479 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.480 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.481 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 WARNING oslo_config.cfg [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:43:04 np0005539510 nova_compute[230961]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:43:04 np0005539510 nova_compute[230961]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:43:04 np0005539510 nova_compute[230961]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:43:04 np0005539510 nova_compute[230961]: ).  Its value may be silently ignored in the future.
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.482 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.483 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.484 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.485 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.486 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.487 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.488 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.489 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.490 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.491 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.492 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.493 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.494 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.495 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.496 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.497 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.498 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.499 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.500 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.501 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.502 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.503 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.504 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.505 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.506 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.507 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.508 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.509 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.510 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.511 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.512 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.513 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.514 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.515 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.516 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.517 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.518 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.519 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.520 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.521 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.522 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.523 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.524 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.525 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.526 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.527 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.528 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.529 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.530 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.531 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.532 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.533 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.534 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.535 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.536 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.537 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.538 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.539 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.540 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.541 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.542 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.543 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.544 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.545 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.546 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.547 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.548 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.549 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.550 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.551 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.551 230965 DEBUG oslo_service.service [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.552 230965 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.570 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.571 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.571 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.571 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:43:04 np0005539510 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:43:04 np0005539510 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.637 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f1cdac99d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.639 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f1cdac99d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.640 230965 INFO nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.661 230965 WARNING nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 29 01:43:04 np0005539510 nova_compute[230961]: 2025-11-29 06:43:04.661 230965 DEBUG nova.virt.libvirt.volume.mount [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 29 01:43:04 np0005539510 python3.9[231479]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.458 230965 INFO nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <host>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <uuid>4a1784f4-2c5f-4879-a5f6-acc886e56ebb</uuid>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <arch>x86_64</arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <microcode version='16777317'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='x2apic'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='tsc-deadline'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='osxsave'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='hypervisor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='tsc_adjust'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='spec-ctrl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='stibp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='arch-capabilities'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='cmp_legacy'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='topoext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='virt-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='lbrv'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='tsc-scale'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='vmcb-clean'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='pause-filter'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='pfthreshold'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='rdctl-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='mds-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature name='pschange-mc-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <pages unit='KiB' size='4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <pages unit='KiB' size='2048'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <power_management>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <suspend_mem/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </power_management>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <iommu support='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <migration_features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <live/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <uri_transports>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <uri_transport>tcp</uri_transport>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <uri_transport>rdma</uri_transport>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </uri_transports>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </migration_features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <topology>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <cells num='1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <cell id='0'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          <memory unit='KiB'>7864320</memory>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          <distances>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <sibling id='0' value='10'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          </distances>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          <cpus num='8'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:          </cpus>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        </cell>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </cells>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </topology>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <cache>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </cache>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <secmodel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model>selinux</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <doi>0</doi>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </secmodel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <secmodel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model>dac</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <doi>0</doi>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </secmodel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </host>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <guest>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <os_type>hvm</os_type>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <arch name='i686'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <wordsize>32</wordsize>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <domain type='qemu'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <domain type='kvm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <pae/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <nonpae/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <apic default='on' toggle='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <cpuselection/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <deviceboot/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <externalSnapshot/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </guest>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <guest>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <os_type>hvm</os_type>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <arch name='x86_64'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <wordsize>64</wordsize>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <domain type='qemu'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <domain type='kvm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <apic default='on' toggle='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <cpuselection/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <deviceboot/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <externalSnapshot/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </guest>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 
Nov 29 01:43:05 np0005539510 nova_compute[230961]: </capabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.465 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.489 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:43:05 np0005539510 nova_compute[230961]: <domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <arch>i686</arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <vcpu max='240'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <os supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='firmware'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>rom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pflash</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>yes</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='secure'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </loader>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </os>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>memfd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </memoryBacking>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>disk</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>floppy</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>lun</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ide</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>fdc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>sata</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </disk>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vnc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </graphics>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <video supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vga</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>none</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>bochs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </video>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='mode'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>requisite</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>optional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pci</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hostdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>random</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </rng>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>path</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>handle</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </filesystem>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emulator</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>external</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>2.0</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </tpm>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </redirdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </channel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </crypto>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>passt</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </interface>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>isa</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </panic>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <console supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>null</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dev</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pipe</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stdio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>udp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tcp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </console>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='features'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vapic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>runtime</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>synic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stimer</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reset</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ipi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>avic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hyperv>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tdx</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </launchSecurity>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: </domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.500 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:43:05 np0005539510 nova_compute[230961]: <domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <arch>i686</arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <vcpu max='4096'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <os supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='firmware'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>rom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pflash</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>yes</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='secure'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </loader>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </os>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>memfd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </memoryBacking>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>disk</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>floppy</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>lun</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>fdc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>sata</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </disk>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vnc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </graphics>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <video supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vga</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>none</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>bochs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </video>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='mode'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>requisite</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>optional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pci</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hostdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>random</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </rng>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>path</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>handle</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </filesystem>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emulator</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>external</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>2.0</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </tpm>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </redirdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </channel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </crypto>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>passt</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </interface>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>isa</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </panic>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <console supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>null</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dev</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pipe</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stdio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>udp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tcp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </console>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='features'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vapic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>runtime</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>synic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stimer</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reset</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ipi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>avic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hyperv>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tdx</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </launchSecurity>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: </domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.549 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.553 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:43:05 np0005539510 nova_compute[230961]: <domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <arch>x86_64</arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <vcpu max='240'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <os supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='firmware'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>rom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pflash</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>yes</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='secure'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </loader>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </os>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>memfd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </memoryBacking>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>disk</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>floppy</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>lun</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ide</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>fdc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>sata</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </disk>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vnc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </graphics>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <video supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vga</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>none</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>bochs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </video>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='mode'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>requisite</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>optional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pci</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hostdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>random</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </rng>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>path</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>handle</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </filesystem>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emulator</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>external</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>2.0</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </tpm>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </redirdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </channel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </crypto>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>passt</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </interface>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>isa</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </panic>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <console supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>null</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dev</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pipe</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stdio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>udp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tcp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </console>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='features'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vapic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>runtime</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>synic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stimer</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reset</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ipi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>avic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hyperv>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tdx</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </launchSecurity>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: </domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.615 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:43:05 np0005539510 nova_compute[230961]: <domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <domain>kvm</domain>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <arch>x86_64</arch>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <vcpu max='4096'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <iothreads supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <os supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='firmware'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>efi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <loader supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>rom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pflash</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='readonly'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>yes</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='secure'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>yes</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>no</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </loader>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </os>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='maximumMigratable'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>on</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>off</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <vendor>AMD</vendor>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='succor'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <mode name='custom' supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Denverton-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='auto-ibrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amd-psfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='stibp-always-on'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='EPYC-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-128'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-256'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx10-512'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='prefetchiti'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Haswell-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512er'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512pf'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fma4'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tbm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xop'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='amx-tile'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-bf16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-fp16'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bitalg'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrc'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fzrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='la57'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='taa-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xfd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ifma'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cmpccxadd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fbsdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='fsrs'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ibrs-all'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mcdt-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pbrsb-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='psdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='serialize'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vaes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='hle'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='rtm'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512bw'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512cd'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512dq'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512f'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='avx512vl'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='invpcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pcid'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='pku'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='mpx'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='core-capability'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='split-lock-detect'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='cldemote'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='erms'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='gfni'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdir64b'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='movdiri'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='xsaves'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='athlon-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='core2duo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='coreduo-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='n270-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='ss'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <blockers model='phenom-v1'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnow'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <feature name='3dnowext'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </blockers>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </mode>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <memoryBacking supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <enum name='sourceType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>anonymous</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <value>memfd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </memoryBacking>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <disk supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='diskDevice'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>disk</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cdrom</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>floppy</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>lun</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>fdc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>sata</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </disk>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <graphics supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vnc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egl-headless</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </graphics>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <video supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='modelType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vga</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>cirrus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>none</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>bochs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ramfb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </video>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hostdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='mode'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>subsystem</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='startupPolicy'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>mandatory</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>requisite</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>optional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='subsysType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pci</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>scsi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='capsType'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='pciBackend'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hostdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <rng supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtio-non-transitional</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>random</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>egd</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </rng>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <filesystem supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='driverType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>path</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>handle</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>virtiofs</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </filesystem>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <tpm supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-tis</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tpm-crb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emulator</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>external</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendVersion'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>2.0</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </tpm>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <redirdev supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='bus'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>usb</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </redirdev>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <channel supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </channel>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <crypto supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendModel'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>builtin</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </crypto>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <interface supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='backendType'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>default</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>passt</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </interface>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <panic supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='model'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>isa</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>hyperv</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </panic>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <console supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='type'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>null</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vc</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pty</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dev</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>file</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>pipe</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stdio</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>udp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tcp</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>unix</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>qemu-vdagent</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>dbus</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </console>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </devices>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <gic supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <genid supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <backup supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <async-teardown supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <ps2 supported='yes'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sev supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <sgx supported='no'/>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <hyperv supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='features'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>relaxed</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vapic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>spinlocks</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vpindex</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>runtime</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>synic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>stimer</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reset</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>vendor_id</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>frequencies</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>reenlightenment</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tlbflush</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>ipi</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>avic</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>emsr_bitmap</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>xmm_input</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </defaults>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </hyperv>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    <launchSecurity supported='yes'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      <enum name='sectype'>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:        <value>tdx</value>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:      </enum>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:    </launchSecurity>
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  </features>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: </domainCapabilities>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.673 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.674 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.674 230965 DEBUG nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.675 230965 INFO nova.virt.libvirt.host [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Secure Boot support detected#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.677 230965 INFO nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.677 230965 INFO nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.693 230965 DEBUG nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:43:05 np0005539510 nova_compute[230961]:  <model>Nehalem</model>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: </cpu>
Nov 29 01:43:05 np0005539510 nova_compute[230961]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.697 230965 DEBUG nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.751 230965 INFO nova.virt.node [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Determined node identity 98b21ca7-b42c-4765-935a-26a89197ffb9 from /var/lib/nova/compute_id#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.778 230965 WARNING nova.compute.manager [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Compute nodes ['98b21ca7-b42c-4765-935a-26a89197ffb9'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.816 230965 INFO nova.compute.manager [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.863 230965 WARNING nova.compute.manager [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.863 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.864 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.864 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.864 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:43:05 np0005539510 nova_compute[230961]: 2025-11-29 06:43:05.865 230965 DEBUG oslo_concurrency.processutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:05 np0005539510 python3.9[231695]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:43:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:06.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:06 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:43:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:06.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:43:06 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3410839554' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.302 230965 DEBUG oslo_concurrency.processutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:06 np0005539510 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:43:06 np0005539510 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.599 230965 WARNING nova.virt.libvirt.driver [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.600 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5295MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.600 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.601 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.664 230965 WARNING nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] No compute node record for compute-2.ctlplane.example.com:98b21ca7-b42c-4765-935a-26a89197ffb9: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 98b21ca7-b42c-4765-935a-26a89197ffb9 could not be found.#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.702 230965 INFO nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 98b21ca7-b42c-4765-935a-26a89197ffb9#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.812 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:43:06 np0005539510 nova_compute[230961]: 2025-11-29 06:43:06.812 230965 DEBUG nova.compute.resource_tracker [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:43:07 np0005539510 python3.9[231916]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:07 np0005539510 systemd[1]: Stopping nova_compute container...
Nov 29 01:43:07 np0005539510 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-2083a47b-be3b-4c75-9b2e-c0018c8dc56f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:07 np0005539510 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:43:07 np0005539510 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:43:07 np0005539510 nova_compute[230961]: 2025-11-29 06:43:07.129 230965 DEBUG oslo_concurrency.lockutils [None req-201417c7-9f8c-4a54-b417-c61f274e8057 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:43:07 np0005539510 virtqemud[231501]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 01:43:07 np0005539510 virtqemud[231501]: hostname: compute-2
Nov 29 01:43:07 np0005539510 virtqemud[231501]: End of file while reading data: Input/output error
Nov 29 01:43:07 np0005539510 systemd[1]: libpod-e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562.scope: Deactivated successfully.
Nov 29 01:43:07 np0005539510 systemd[1]: libpod-e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562.scope: Consumed 3.677s CPU time.
Nov 29 01:43:07 np0005539510 podman[231920]: 2025-11-29 06:43:07.56070514 +0000 UTC m=+0.469410406 container died e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:43:07 np0005539510 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:07 np0005539510 systemd[1]: var-lib-containers-storage-overlay-0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52-merged.mount: Deactivated successfully.
Nov 29 01:43:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:08.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:08.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:10.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:10 np0005539510 podman[231920]: 2025-11-29 06:43:10.362167102 +0000 UTC m=+3.270872378 container cleanup e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:43:10 np0005539510 podman[231920]: nova_compute
Nov 29 01:43:10 np0005539510 podman[231951]: nova_compute
Nov 29 01:43:10 np0005539510 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 01:43:10 np0005539510 systemd[1]: Stopped nova_compute container.
Nov 29 01:43:10 np0005539510 systemd[1]: Starting nova_compute container...
Nov 29 01:43:10 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:43:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:10 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0541d2257bfa6bdc6ff93b1f3fca62df03cb2f349676aeee8dc393c9423a8e52/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:10 np0005539510 podman[231965]: 2025-11-29 06:43:10.545604132 +0000 UTC m=+0.081307690 container init e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125)
Nov 29 01:43:10 np0005539510 podman[231965]: 2025-11-29 06:43:10.553652091 +0000 UTC m=+0.089355639 container start e1b49bbceae24f0cb59155d974e14d2fdbe66d656c015d4c0aaa99f38eb23562 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 01:43:10 np0005539510 podman[231965]: nova_compute
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + sudo -E kolla_set_configs
Nov 29 01:43:10 np0005539510 systemd[1]: Started nova_compute container.
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Validating config file
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying service configuration files
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Writing out command to execute
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:10 np0005539510 nova_compute[231979]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:43:10 np0005539510 nova_compute[231979]: ++ cat /run_command
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + CMD=nova-compute
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + ARGS=
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + sudo kolla_copy_cacerts
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + [[ ! -n '' ]]
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + . kolla_extend_start
Nov 29 01:43:10 np0005539510 nova_compute[231979]: Running command: 'nova-compute'
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + umask 0022
Nov 29 01:43:10 np0005539510 nova_compute[231979]: + exec nova-compute
Nov 29 01:43:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:12.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:12.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:12 np0005539510 python3.9[232193]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:43:12 np0005539510 systemd[1]: Started libpod-conmon-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2.scope.
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.678 231983 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.680 231983 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.680 231983 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.681 231983 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:43:12 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:43:12 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539510 podman[232222]: 2025-11-29 06:43:12.717039106 +0000 UTC m=+0.108277622 container init d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, config_id=edpm, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Nov 29 01:43:12 np0005539510 podman[232222]: 2025-11-29 06:43:12.723977467 +0000 UTC m=+0.115215963 container start d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:43:12 np0005539510 python3.9[232193]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 01:43:12 np0005539510 nova_compute_init[232244]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 01:43:12 np0005539510 systemd[1]: libpod-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2.scope: Deactivated successfully.
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.826 231983 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:12 np0005539510 podman[232259]: 2025-11-29 06:43:12.843170512 +0000 UTC m=+0.038793171 container died d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.849 231983 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:12 np0005539510 nova_compute[231979]: 2025-11-29 06:43:12.850 231983 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:43:12 np0005539510 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:12 np0005539510 systemd[1]: var-lib-containers-storage-overlay-9edd1240065e5824189d7d86d8b821e543ee68922e9bc4b93c6cec0888f7278b-merged.mount: Deactivated successfully.
Nov 29 01:43:12 np0005539510 podman[232259]: 2025-11-29 06:43:12.875910175 +0000 UTC m=+0.071532814 container cleanup d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 01:43:12 np0005539510 systemd[1]: libpod-conmon-d8f52aa1a5a9bded18911397b98d2bd5886244cb8e06988f3ff513cabeca2db2.scope: Deactivated successfully.
Nov 29 01:43:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:14.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:14.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:43:15.134 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:43:15.134 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:43:15.135 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:16.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:16 np0005539510 systemd[1]: session-50.scope: Deactivated successfully.
Nov 29 01:43:16 np0005539510 systemd[1]: session-50.scope: Consumed 2min 16.828s CPU time.
Nov 29 01:43:16 np0005539510 systemd-logind[784]: Session 50 logged out. Waiting for processes to exit.
Nov 29 01:43:16 np0005539510 systemd-logind[784]: Removed session 50.
Nov 29 01:43:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:16.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.064 231983 INFO nova.virt.driver [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.177 231983 INFO nova.compute.provider_config [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.694 231983 DEBUG oslo_concurrency.lockutils [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.695 231983 DEBUG oslo_concurrency.lockutils [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.695 231983 DEBUG oslo_concurrency.lockutils [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.696 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.696 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.697 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.698 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.699 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.699 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.699 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.700 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.700 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.700 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.701 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.702 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.702 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.702 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.703 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.703 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.703 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.704 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.704 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.705 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.706 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.707 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.708 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.709 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.710 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.711 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.712 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.713 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.714 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.714 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.714 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.715 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.716 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.717 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.718 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.719 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.720 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.721 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.722 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.723 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.724 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.725 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.726 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.727 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.728 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.729 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.730 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.731 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.732 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.733 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.734 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.735 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.736 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.737 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.738 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.739 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.740 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.741 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.742 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.743 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.744 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.745 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.746 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.747 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.748 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.749 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.750 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.751 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.752 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.753 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.754 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.755 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.756 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.757 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.758 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.759 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.760 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.761 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.762 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.763 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.764 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.765 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.766 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.767 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.768 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.769 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.770 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.771 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.772 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.773 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.774 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.775 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 WARNING oslo_config.cfg [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:43:17 np0005539510 nova_compute[231979]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:43:17 np0005539510 nova_compute[231979]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:43:17 np0005539510 nova_compute[231979]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:43:17 np0005539510 nova_compute[231979]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.776 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.777 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.778 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.779 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_secret_uuid        = 336ec58c-893b-528f-a0c1-6ed1196bc047 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.780 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.781 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.782 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.783 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.784 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.785 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.786 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.787 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.788 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.789 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.790 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.791 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.792 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.793 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.794 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.795 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.796 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.797 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.798 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.799 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.800 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.801 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.802 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.803 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.804 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.805 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.806 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.807 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.808 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.809 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.810 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.811 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.812 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.813 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.814 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.815 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.816 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.817 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.818 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.819 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.820 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.821 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.822 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.823 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.824 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.825 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.826 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.827 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.828 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.829 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.830 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.831 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.832 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.833 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.834 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.835 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.836 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.837 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.838 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.839 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.840 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.841 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.842 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.843 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.844 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.845 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.846 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.847 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.848 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.849 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.850 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.851 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.852 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.853 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.854 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.855 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.856 231983 DEBUG oslo_service.service [None req-37b852b8-965d-4712-b465-b94722003d9a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:43:17 np0005539510 nova_compute[231979]: 2025-11-29 06:43:17.858 231983 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:43:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:18.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:18.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.317 231983 INFO nova.virt.node [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Determined node identity 98b21ca7-b42c-4765-935a-26a89197ffb9 from /var/lib/nova/compute_id#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.318 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.319 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.319 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.319 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.332 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f43a411bb20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.337 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f43a411bb20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.337 231983 INFO nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.344 231983 INFO nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <host>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <uuid>4a1784f4-2c5f-4879-a5f6-acc886e56ebb</uuid>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <arch>x86_64</arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model>EPYC-Rome-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <vendor>AMD</vendor>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <microcode version='16777317'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='x2apic'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='tsc-deadline'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='osxsave'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='hypervisor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='tsc_adjust'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='spec-ctrl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='stibp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='arch-capabilities'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='cmp_legacy'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='topoext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='virt-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='lbrv'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='tsc-scale'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='vmcb-clean'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='pause-filter'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='pfthreshold'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='svme-addr-chk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='rdctl-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='mds-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature name='pschange-mc-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <pages unit='KiB' size='4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <pages unit='KiB' size='2048'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <power_management>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <suspend_mem/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </power_management>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <iommu support='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <migration_features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <live/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <uri_transports>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <uri_transport>tcp</uri_transport>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <uri_transport>rdma</uri_transport>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </uri_transports>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </migration_features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <topology>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <cells num='1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <cell id='0'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          <memory unit='KiB'>7864320</memory>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          <distances>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <sibling id='0' value='10'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          </distances>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          <cpus num='8'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:          </cpus>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        </cell>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </cells>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </topology>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <cache>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </cache>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <secmodel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model>selinux</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <doi>0</doi>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </secmodel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <secmodel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model>dac</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <doi>0</doi>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </secmodel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </host>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <guest>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <os_type>hvm</os_type>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <arch name='i686'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <wordsize>32</wordsize>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <domain type='qemu'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <domain type='kvm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <pae/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <nonpae/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <apic default='on' toggle='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <cpuselection/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <deviceboot/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <externalSnapshot/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </guest>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <guest>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <os_type>hvm</os_type>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <arch name='x86_64'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <wordsize>64</wordsize>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <domain type='qemu'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <domain type='kvm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <acpi default='on' toggle='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <apic default='on' toggle='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <cpuselection/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <deviceboot/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <externalSnapshot/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </guest>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 
Nov 29 01:43:19 np0005539510 nova_compute[231979]: </capabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.350 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.353 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:43:19 np0005539510 nova_compute[231979]: <domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <domain>kvm</domain>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <arch>i686</arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <vcpu max='4096'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <iothreads supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <os supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='firmware'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <loader supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>rom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pflash</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='readonly'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>yes</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='secure'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </loader>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </os>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='maximumMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <vendor>AMD</vendor>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='succor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='custom' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-128'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-256'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-512'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <memoryBacking supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='sourceType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>anonymous</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>memfd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </memoryBacking>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <disk supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='diskDevice'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>disk</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cdrom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>floppy</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>lun</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>fdc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>sata</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </disk>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <graphics supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vnc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egl-headless</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </graphics>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <video supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='modelType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vga</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cirrus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>none</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>bochs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ramfb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </video>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hostdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='mode'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>subsystem</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='startupPolicy'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>mandatory</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>requisite</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>optional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='subsysType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pci</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='capsType'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='pciBackend'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hostdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <rng supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>random</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </rng>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <filesystem supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='driverType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>path</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>handle</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtiofs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </filesystem>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <tpm supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-tis</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-crb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emulator</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>external</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendVersion'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>2.0</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </tpm>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <redirdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </redirdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <channel supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </channel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <crypto supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </crypto>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <interface supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>passt</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </interface>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <panic supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>isa</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>hyperv</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </panic>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <console supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>null</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dev</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pipe</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stdio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>udp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tcp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu-vdagent</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </console>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <gic supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <genid supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backup supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <async-teardown supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <ps2 supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sev supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sgx supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hyperv supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='features'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>relaxed</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vapic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>spinlocks</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vpindex</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>runtime</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>synic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stimer</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reset</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vendor_id</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>frequencies</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reenlightenment</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tlbflush</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ipi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>avic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emsr_bitmap</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>xmm_input</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hyperv>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <launchSecurity supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='sectype'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tdx</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </launchSecurity>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: </domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.357 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:43:19 np0005539510 nova_compute[231979]: <domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <domain>kvm</domain>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <arch>i686</arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <vcpu max='240'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <iothreads supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <os supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='firmware'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <loader supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>rom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pflash</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='readonly'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>yes</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='secure'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </loader>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </os>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='maximumMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <vendor>AMD</vendor>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='succor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='custom' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-128'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-256'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-512'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <memoryBacking supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='sourceType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>anonymous</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>memfd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </memoryBacking>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <disk supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='diskDevice'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>disk</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cdrom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>floppy</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>lun</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ide</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>fdc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>sata</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </disk>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <graphics supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vnc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egl-headless</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </graphics>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <video supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='modelType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vga</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cirrus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>none</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>bochs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ramfb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </video>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hostdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='mode'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>subsystem</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='startupPolicy'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>mandatory</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>requisite</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>optional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='subsysType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pci</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='capsType'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='pciBackend'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hostdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <rng supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>random</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </rng>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <filesystem supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='driverType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>path</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>handle</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtiofs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </filesystem>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <tpm supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-tis</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-crb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emulator</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>external</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendVersion'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>2.0</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </tpm>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <redirdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </redirdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <channel supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </channel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <crypto supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </crypto>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <interface supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>passt</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </interface>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <panic supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>isa</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>hyperv</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </panic>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <console supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>null</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dev</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pipe</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stdio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>udp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tcp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu-vdagent</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </console>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <gic supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <genid supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backup supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <async-teardown supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <ps2 supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sev supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sgx supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hyperv supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='features'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>relaxed</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vapic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>spinlocks</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vpindex</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>runtime</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>synic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stimer</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reset</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vendor_id</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>frequencies</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reenlightenment</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tlbflush</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ipi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>avic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emsr_bitmap</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>xmm_input</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hyperv>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <launchSecurity supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='sectype'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tdx</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </launchSecurity>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: </domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.383 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.389 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:43:19 np0005539510 nova_compute[231979]: <domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <domain>kvm</domain>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <arch>x86_64</arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <vcpu max='4096'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <iothreads supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <os supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='firmware'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>efi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <loader supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>rom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pflash</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='readonly'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>yes</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='secure'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>yes</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </loader>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </os>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='maximumMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <vendor>AMD</vendor>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='succor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='custom' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-128'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-256'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-512'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <memoryBacking supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='sourceType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>anonymous</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>memfd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </memoryBacking>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <disk supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='diskDevice'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>disk</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cdrom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>floppy</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>lun</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>fdc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>sata</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </disk>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <graphics supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vnc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egl-headless</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </graphics>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <video supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='modelType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vga</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cirrus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>none</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>bochs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ramfb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </video>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hostdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='mode'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>subsystem</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='startupPolicy'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>mandatory</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>requisite</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>optional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='subsysType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pci</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='capsType'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='pciBackend'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hostdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <rng supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>random</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </rng>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <filesystem supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='driverType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>path</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>handle</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtiofs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </filesystem>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <tpm supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-tis</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-crb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emulator</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>external</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendVersion'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>2.0</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </tpm>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <redirdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </redirdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <channel supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </channel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <crypto supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </crypto>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <interface supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>passt</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </interface>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <panic supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>isa</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>hyperv</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </panic>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <console supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>null</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dev</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pipe</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stdio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>udp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tcp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu-vdagent</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </console>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <gic supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <genid supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backup supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <async-teardown supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <ps2 supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sev supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sgx supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hyperv supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='features'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>relaxed</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vapic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>spinlocks</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vpindex</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>runtime</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>synic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stimer</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reset</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vendor_id</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>frequencies</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reenlightenment</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tlbflush</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ipi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>avic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emsr_bitmap</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>xmm_input</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hyperv>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <launchSecurity supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='sectype'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tdx</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </launchSecurity>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: </domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.468 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:43:19 np0005539510 nova_compute[231979]: <domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <domain>kvm</domain>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <arch>x86_64</arch>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <vcpu max='240'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <iothreads supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <os supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='firmware'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <loader supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>rom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pflash</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='readonly'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>yes</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='secure'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>no</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </loader>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </os>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='maximum' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='maximumMigratable'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>on</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>off</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='host-model' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <vendor>AMD</vendor>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='x2apic'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='stibp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='succor'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lbrv'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <mode name='custom' supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Broadwell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Cooperlake-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Denverton-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Dhyana-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='auto-ibrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amd-psfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='no-nested-data-bp'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='null-sel-clr-base'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='stibp-always-on'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='EPYC-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-128'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-256'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx10-512'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='prefetchiti'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Haswell-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='IvyBridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='KnightsMill-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4fmaps'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-4vnniw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512er'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512pf'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fma4'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tbm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xop'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='amx-tile'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-bf16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-fp16'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bitalg'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vbmi2'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrc'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fzrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='la57'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='taa-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='tsx-ldtrk'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xfd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='SierraForest-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ifma'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-ne-convert'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx-vnni-int8'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='bus-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cmpccxadd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fbsdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='fsrs'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ibrs-all'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mcdt-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pbrsb-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='psdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='serialize'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vaes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='vpclmulqdq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='hle'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='rtm'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512bw'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512cd'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512dq'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512f'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='avx512vl'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='invpcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pcid'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='pku'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='mpx'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v2'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v3'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='core-capability'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='split-lock-detect'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='Snowridge-v4'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='cldemote'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='erms'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='gfni'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdir64b'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='movdiri'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='xsaves'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='athlon-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='core2duo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='coreduo-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='n270-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='ss'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <blockers model='phenom-v1'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnow'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <feature name='3dnowext'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </blockers>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </mode>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <memoryBacking supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <enum name='sourceType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>anonymous</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <value>memfd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </memoryBacking>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <disk supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='diskDevice'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>disk</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cdrom</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>floppy</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>lun</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ide</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>fdc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>sata</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </disk>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <graphics supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vnc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egl-headless</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </graphics>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <video supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='modelType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vga</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>cirrus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>none</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>bochs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ramfb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </video>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hostdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='mode'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>subsystem</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='startupPolicy'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>mandatory</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>requisite</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>optional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='subsysType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pci</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>scsi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='capsType'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='pciBackend'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hostdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <rng supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtio-non-transitional</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>random</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>egd</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </rng>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <filesystem supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='driverType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>path</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>handle</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>virtiofs</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </filesystem>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <tpm supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-tis</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tpm-crb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emulator</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>external</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendVersion'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>2.0</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </tpm>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <redirdev supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='bus'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>usb</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </redirdev>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <channel supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </channel>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <crypto supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendModel'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>builtin</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </crypto>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <interface supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='backendType'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>default</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>passt</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </interface>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <panic supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='model'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>isa</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>hyperv</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </panic>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <console supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='type'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>null</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vc</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pty</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dev</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>file</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>pipe</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stdio</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>udp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tcp</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>unix</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>qemu-vdagent</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>dbus</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </console>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </devices>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <gic supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <vmcoreinfo supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <genid supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backingStoreInput supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <backup supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <async-teardown supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <ps2 supported='yes'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sev supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <sgx supported='no'/>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <hyperv supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='features'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>relaxed</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vapic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>spinlocks</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vpindex</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>runtime</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>synic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>stimer</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reset</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>vendor_id</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>frequencies</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>reenlightenment</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tlbflush</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>ipi</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>avic</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>emsr_bitmap</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>xmm_input</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <spinlocks>4095</spinlocks>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <stimer_direct>on</stimer_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </defaults>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </hyperv>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    <launchSecurity supported='yes'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      <enum name='sectype'>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:        <value>tdx</value>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:      </enum>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:    </launchSecurity>
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  </features>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: </domainCapabilities>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.536 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.536 231983 INFO nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Secure Boot support detected#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.538 231983 INFO nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.538 231983 INFO nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.547 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:43:19 np0005539510 nova_compute[231979]:  <model>Nehalem</model>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: </cpu>
Nov 29 01:43:19 np0005539510 nova_compute[231979]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 29 01:43:19 np0005539510 nova_compute[231979]: 2025-11-29 06:43:19.549 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 29 01:43:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:20.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:20 np0005539510 nova_compute[231979]: 2025-11-29 06:43:20.223 231983 DEBUG nova.virt.libvirt.volume.mount [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 29 01:43:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:20.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:20 np0005539510 nova_compute[231979]: 2025-11-29 06:43:20.638 231983 INFO nova.virt.node [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Determined node identity 98b21ca7-b42c-4765-935a-26a89197ffb9 from /var/lib/nova/compute_id#033[00m
Nov 29 01:43:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:20 np0005539510 nova_compute[231979]: 2025-11-29 06:43:20.936 231983 DEBUG nova.compute.manager [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Verified node 98b21ca7-b42c-4765-935a-26a89197ffb9 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 29 01:43:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:22.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:22.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:22 np0005539510 nova_compute[231979]: 2025-11-29 06:43:22.320 231983 INFO nova.compute.manager [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 29 01:43:22 np0005539510 nova_compute[231979]: 2025-11-29 06:43:22.767 231983 ERROR nova.compute.manager [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Could not retrieve compute node resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-e6c1b923-6add-4afe-8ca9-60e8e7cf5088"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-e6c1b923-6add-4afe-8ca9-60e8e7cf5088"}]}#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.080 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.080 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.080 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.081 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.081 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:24.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:24.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:43:24 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1216257790' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.484 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.403s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.701 231983 WARNING nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.702 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5289MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.702 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:24 np0005539510 nova_compute[231979]: 2025-11-29 06:43:24.702 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:24 np0005539510 podman[232360]: 2025-11-29 06:43:24.925505837 +0000 UTC m=+0.072346476 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:43:25 np0005539510 podman[232359]: 2025-11-29 06:43:25.060277129 +0000 UTC m=+0.207260311 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:43:25 np0005539510 nova_compute[231979]: 2025-11-29 06:43:25.455 231983 ERROR nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-ef7a20f9-7910-4778-ace3-4e24adb16a59"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '98b21ca7-b42c-4765-935a-26a89197ffb9' not found: No resource provider with uuid 98b21ca7-b42c-4765-935a-26a89197ffb9 found  ", "request_id": "req-ef7a20f9-7910-4778-ace3-4e24adb16a59"}]}#033[00m
Nov 29 01:43:25 np0005539510 nova_compute[231979]: 2025-11-29 06:43:25.456 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:43:25 np0005539510 nova_compute[231979]: 2025-11-29 06:43:25.456 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:43:25 np0005539510 nova_compute[231979]: 2025-11-29 06:43:25.663 231983 INFO nova.scheduler.client.report [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] [req-1d3f7d80-8081-4395-8499-b37bcb8b8af4] Created resource provider record via placement API for resource provider with UUID 98b21ca7-b42c-4765-935a-26a89197ffb9 and name compute-2.ctlplane.example.com.#033[00m
Nov 29 01:43:25 np0005539510 nova_compute[231979]: 2025-11-29 06:43:25.760 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:43:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:26.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:43:26 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2300023521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.214 231983 DEBUG oslo_concurrency.processutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.221 231983 DEBUG nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 01:43:26 np0005539510 nova_compute[231979]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.221 231983 INFO nova.virt.libvirt.host [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.223 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.224 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.230 231983 DEBUG nova.virt.libvirt.driver [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 01:43:26 np0005539510 nova_compute[231979]:  <arch>x86_64</arch>
Nov 29 01:43:26 np0005539510 nova_compute[231979]:  <model>Nehalem</model>
Nov 29 01:43:26 np0005539510 nova_compute[231979]:  <vendor>AMD</vendor>
Nov 29 01:43:26 np0005539510 nova_compute[231979]:  <topology sockets="8" cores="1" threads="1"/>
Nov 29 01:43:26 np0005539510 nova_compute[231979]: </cpu>
Nov 29 01:43:26 np0005539510 nova_compute[231979]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 29 01:43:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:26.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.599 231983 DEBUG nova.scheduler.client.report [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updated inventory for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.600 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.600 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:43:26 np0005539510 nova_compute[231979]: 2025-11-29 06:43:26.765 231983 DEBUG nova.compute.provider_tree [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Updating resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 29 01:43:27 np0005539510 nova_compute[231979]: 2025-11-29 06:43:27.322 231983 DEBUG nova.compute.resource_tracker [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:43:27 np0005539510 nova_compute[231979]: 2025-11-29 06:43:27.323 231983 DEBUG oslo_concurrency.lockutils [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:27 np0005539510 nova_compute[231979]: 2025-11-29 06:43:27.323 231983 DEBUG nova.service [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 29 01:43:27 np0005539510 podman[232429]: 2025-11-29 06:43:27.892832439 +0000 UTC m=+0.057856398 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 29 01:43:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:28.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:28.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:28 np0005539510 nova_compute[231979]: 2025-11-29 06:43:28.663 231983 DEBUG nova.service [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 29 01:43:28 np0005539510 nova_compute[231979]: 2025-11-29 06:43:28.664 231983 DEBUG nova.servicegroup.drivers.db [None req-35441950-af3a-4b04-a3a9-f86bdfaeef91 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 29 01:43:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:30.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:30.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:32.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:32.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:33 np0005539510 nova_compute[231979]: 2025-11-29 06:43:33.667 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:34.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:34.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:34 np0005539510 nova_compute[231979]: 2025-11-29 06:43:34.288 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:36.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:36.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:38.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:38.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:40.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:40.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:42.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:42.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:44.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:44.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:46.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:46.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:48.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:48.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:50.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:50.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:52.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:52.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:54.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:54.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:55 np0005539510 podman[232566]: 2025-11-29 06:43:55.892539464 +0000 UTC m=+0.051125673 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 29 01:43:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:43:55 np0005539510 podman[232565]: 2025-11-29 06:43:55.919838815 +0000 UTC m=+0.080199380 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:43:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:56.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:56.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:43:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:43:58.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:43:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:43:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:43:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:43:58.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:43:58 np0005539510 podman[232612]: 2025-11-29 06:43:58.903669856 +0000 UTC m=+0.068095675 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:44:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:00.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:00.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:02.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:04.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:04.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:44:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:44:04 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:44:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:06.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:06.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:08.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:08.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:10.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:10.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:44:12 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:44:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:12.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:12.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:12 np0005539510 nova_compute[231979]: 2025-11-29 06:44:12.863 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:12 np0005539510 nova_compute[231979]: 2025-11-29 06:44:12.863 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:12 np0005539510 nova_compute[231979]: 2025-11-29 06:44:12.864 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:44:12 np0005539510 nova_compute[231979]: 2025-11-29 06:44:12.864 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:44:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:14.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:14.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:44:15.134 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:44:15.135 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:44:15.135 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:16.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:16.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.440 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.77 sec#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.459 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.459 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.459 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.460 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.461 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:44:17 np0005539510 nova_compute[231979]: 2025-11-29 06:44:17.461 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:18.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:18.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:20.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:20.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:22.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:22.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:24.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:24.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:26.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:26.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:26 np0005539510 podman[232878]: 2025-11-29 06:44:26.886660509 +0000 UTC m=+0.049935733 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:44:26 np0005539510 podman[232877]: 2025-11-29 06:44:26.920753964 +0000 UTC m=+0.082334203 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:44:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:28.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:28.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:29 np0005539510 podman[232922]: 2025-11-29 06:44:29.883682507 +0000 UTC m=+0.051152835 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd)
Nov 29 01:44:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:30.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:30.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:32.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:32.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:34.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:34.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.109 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.110 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.110 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.111 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.112 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.134 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 8.69 sec#033[00m
Nov 29 01:44:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:36.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:36.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:44:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1970020033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.574 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.724 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.725 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5332MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.725 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:36 np0005539510 nova_compute[231979]: 2025-11-29 06:44:36.726 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:38.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:38.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:40.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:40.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:40 np0005539510 nova_compute[231979]: 2025-11-29 06:44:40.485 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:44:40 np0005539510 nova_compute[231979]: 2025-11-29 06:44:40.485 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:44:40 np0005539510 nova_compute[231979]: 2025-11-29 06:44:40.543 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.926786) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680926873, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2761, "num_deletes": 509, "total_data_size": 6471047, "memory_usage": 6557936, "flush_reason": "Manual Compaction"}
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680959675, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4239117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15313, "largest_seqno": 18069, "table_properties": {"data_size": 4228438, "index_size": 6469, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 23481, "raw_average_key_size": 18, "raw_value_size": 4205271, "raw_average_value_size": 3380, "num_data_blocks": 289, "num_entries": 1244, "num_filter_entries": 1244, "num_deletions": 509, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398425, "oldest_key_time": 1764398425, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 32896 microseconds, and 8263 cpu microseconds.
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.959713) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4239117 bytes OK
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.959732) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.961628) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.961644) EVENT_LOG_v1 {"time_micros": 1764398680961640, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.961659) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 6458171, prev total WAL file size 6458171, number of live WAL files 2.
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.963147) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323535' seq:0, type:0; will stop at (end)
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4139KB)], [30(9302KB)]
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398680963196, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 13764729, "oldest_snapshot_seqno": -1}
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:44:40 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/10673063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:44:40 np0005539510 nova_compute[231979]: 2025-11-29 06:44:40.982 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:44:40 np0005539510 nova_compute[231979]: 2025-11-29 06:44:40.987 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4723 keys, 11178089 bytes, temperature: kUnknown
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681043712, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11178089, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11142641, "index_size": 22538, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11845, "raw_key_size": 118499, "raw_average_key_size": 25, "raw_value_size": 11053323, "raw_average_value_size": 2340, "num_data_blocks": 935, "num_entries": 4723, "num_filter_entries": 4723, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398680, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.044207) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11178089 bytes
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.045670) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 170.5 rd, 138.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 9.1 +0.0 blob) out(10.7 +0.0 blob), read-write-amplify(5.9) write-amplify(2.6) OK, records in: 5757, records dropped: 1034 output_compression: NoCompression
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.045697) EVENT_LOG_v1 {"time_micros": 1764398681045685, "job": 16, "event": "compaction_finished", "compaction_time_micros": 80730, "compaction_time_cpu_micros": 24703, "output_level": 6, "num_output_files": 1, "total_output_size": 11178089, "num_input_records": 5757, "num_output_records": 4723, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681046629, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398681048719, "job": 16, "event": "table_file_deletion", "file_number": 30}
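The rocksdb lines above embed single-line JSON payloads after the literal marker `EVENT_LOG_v1` (sometimes behind an `(Original Log Time ...)` prefix). A minimal sketch for pulling those records out of journald text, written only against the line shapes visible in this log:

```python
import json
import re

# Everything after "EVENT_LOG_v1 " on these lines is one JSON object;
# the marker and field names are taken from this log, not from any
# RocksDB API guarantee.
EVENT_RE = re.compile(r"EVENT_LOG_v1 (\{.*\})")

def parse_events(lines):
    """Yield the decoded JSON dict for every EVENT_LOG_v1 record."""
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            yield json.loads(m.group(1))

# Sample taken verbatim (trimmed to a few fields) from the log above.
sample = (
    'Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: '
    '(Original Log Time 2025/11/29-06:44:41.045697) EVENT_LOG_v1 '
    '{"time_micros": 1764398681045685, "job": 16, '
    '"event": "compaction_finished", "output_level": 6, '
    '"total_output_size": 11178089}'
)
events = list(parse_events([sample]))
```

Filtering the decoded dicts on `event` (`"flush_started"`, `"compaction_finished"`, `"table_file_deletion"`, ...) gives a quick timeline of monitor-store churn.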
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:40.963048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:44:41.048778) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:44:41 np0005539510 nova_compute[231979]: 2025-11-29 06:44:41.298 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:44:41 np0005539510 nova_compute[231979]: 2025-11-29 06:44:41.300 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:44:41 np0005539510 nova_compute[231979]: 2025-11-29 06:44:41.300 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
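The oslo.concurrency line above reports that the resource tracker held the `compute_resources` lock for 4.574 s. Hold times like this are worth tracking when update cycles slow down; a small sketch matched against the exact wording in this log (`Lock "<name>" "released" by "<owner>" :: held <sec>s`):

```python
import re

# Regex written against the lockutils message format seen in this log;
# other oslo.concurrency versions may word the line differently.
LOCK_RE = re.compile(
    r'Lock "(?P<name>[^"]+)" "released" by "(?P<owner>[^"]+)" '
    r':: held (?P<held>[\d.]+)s'
)

def lock_hold_time(line):
    """Return (lock_name, seconds_held) or None if the line doesn't match."""
    m = LOCK_RE.search(line)
    return (m.group("name"), float(m.group("held"))) if m else None

sample = ('Lock "compute_resources" "released" by '
          '"nova.compute.resource_tracker.ResourceTracker.'
          '_update_available_resource" :: held 4.574s')
result = lock_hold_time(sample)
```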
Nov 29 01:44:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:44:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:42.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:44:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:42.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:44.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:44.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:46.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:46.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:48.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:48.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:50.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:50.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:52.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:52.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:54.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:54.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:44:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:44:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:56.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:44:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:56.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:57 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:44:57 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:44:57 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:44:57 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1524599570' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:44:57 np0005539510 podman[233101]: 2025-11-29 06:44:57.884374586 +0000 UTC m=+0.049064379 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:44:57 np0005539510 podman[233100]: 2025-11-29 06:44:57.914871117 +0000 UTC m=+0.081151372 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:44:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:44:58.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:44:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:44:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:44:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:44:58.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:00.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:00.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:00 np0005539510 podman[233145]: 2025-11-29 06:45:00.884891615 +0000 UTC m=+0.050208359 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:45:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:02.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:02.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.881314) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702881437, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 459, "num_deletes": 251, "total_data_size": 609291, "memory_usage": 619112, "flush_reason": "Manual Compaction"}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702889173, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 401811, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18074, "largest_seqno": 18528, "table_properties": {"data_size": 399311, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6105, "raw_average_key_size": 18, "raw_value_size": 394334, "raw_average_value_size": 1209, "num_data_blocks": 28, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398682, "oldest_key_time": 1764398682, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 7907 microseconds, and 4279 cpu microseconds.
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889239) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 401811 bytes OK
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.889271) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.891429) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.891460) EVENT_LOG_v1 {"time_micros": 1764398702891453, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.891481) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 606475, prev total WAL file size 606475, number of live WAL files 2.
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.892096) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(392KB)], [33(10MB)]
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702892341, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 11579900, "oldest_snapshot_seqno": -1}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4538 keys, 9460634 bytes, temperature: kUnknown
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702958888, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 9460634, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9427908, "index_size": 20264, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11397, "raw_key_size": 115326, "raw_average_key_size": 25, "raw_value_size": 9343174, "raw_average_value_size": 2058, "num_data_blocks": 832, "num_entries": 4538, "num_filter_entries": 4538, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.959146) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 9460634 bytes
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.960833) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.0 rd, 142.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 10.7 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(52.4) write-amplify(23.5) OK, records in: 5049, records dropped: 511 output_compression: NoCompression
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.960856) EVENT_LOG_v1 {"time_micros": 1764398702960846, "job": 18, "event": "compaction_finished", "compaction_time_micros": 66546, "compaction_time_cpu_micros": 19907, "output_level": 6, "num_output_files": 1, "total_output_size": 9460634, "num_input_records": 5049, "num_output_records": 4538, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702961043, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398702963208, "job": 18, "event": "table_file_deletion", "file_number": 33}
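The JOB 18 amplification figures above can be reproduced from the byte counts in the surrounding lines: inputs were table #35 (401811 bytes at L0) and table #33 (11178089 bytes at L6), and output table #36 was 9460634 bytes. The formulas below are inferred from those numbers matching the reported 52.4 and 23.5, so treat them as a reading aid rather than RocksDB's authoritative definition:

```python
# Byte counts copied from the JOB 18 lines in this log.
l0_in = 401_811      # table #35, the flushed L0 input
l6_in = 11_178_089   # table #33, the existing L6 input
out   = 9_460_634    # table #36, the compaction output

# read-write-amplify appears to be (all input + output) / L0 input,
# and write-amplify appears to be output / L0 input.
rw_amplify = (l0_in + l6_in + out) / l0_in   # log reports 52.4
w_amplify = out / l0_in                      # log reports 23.5
```

The high amplification here is expected: a tiny 392 KB flush forced a rewrite of the whole ~10.7 MB L6 file, which is the usual cost profile of the monitor's periodic manual compactions.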
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.892013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963285) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:02 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:45:02.963291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:45:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:04.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:04.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:06.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:06.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:08.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:08.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:10.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:10.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:12.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:45:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:45:13 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:45:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:14.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:14.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:45:15.136 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:45:15.137 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:45:15.137 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:16.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:16.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:18.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:18.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:20.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:20.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:22.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:22.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:45:24 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:45:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:24.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:26.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:28.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:28.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:28 np0005539510 podman[233411]: 2025-11-29 06:45:28.895852698 +0000 UTC m=+0.058520032 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:45:28 np0005539510 podman[233410]: 2025-11-29 06:45:28.924687157 +0000 UTC m=+0.088344998 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 01:45:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:30.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:30.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:31 np0005539510 podman[233457]: 2025-11-29 06:45:31.910038571 +0000 UTC m=+0.074891558 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Nov 29 01:45:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:32.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:34.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:34.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:36.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:38.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:38.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:40.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:40.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:41 np0005539510 nova_compute[231979]: 2025-11-29 06:45:41.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:41 np0005539510 nova_compute[231979]: 2025-11-29 06:45:41.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:42.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:42.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:44.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:44.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:46.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:48.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:48.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:50.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:50.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:52.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:45:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:52.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:45:52 np0005539510 nova_compute[231979]: 2025-11-29 06:45:52.881 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:52 np0005539510 nova_compute[231979]: 2025-11-29 06:45:52.881 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:45:52 np0005539510 nova_compute[231979]: 2025-11-29 06:45:52.881 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:45:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:54.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:54.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.564 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.565 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.566 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.566 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.567 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.567 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.567 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.568 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:45:55 np0005539510 nova_compute[231979]: 2025-11-29 06:45:55.568 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:45:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:56.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:56.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:45:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:45:58.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:45:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:45:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:45:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:45:58.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:45:59 np0005539510 podman[233591]: 2025-11-29 06:45:59.925539954 +0000 UTC m=+0.085809310 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:45:59 np0005539510 podman[233592]: 2025-11-29 06:45:59.934658567 +0000 UTC m=+0.087384722 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:46:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:00.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:00.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:02.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:02 np0005539510 podman[233637]: 2025-11-29 06:46:02.890988708 +0000 UTC m=+0.057265939 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 01:46:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:04.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:04.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:06.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:06.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:08.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:08.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.007 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.008 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:46:09 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:46:09 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2861061763' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.430 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.555 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.42 sec#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.589 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.590 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5304MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.591 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:09 np0005539510 nova_compute[231979]: 2025-11-29 06:46:09.591 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:10.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:10.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:10 np0005539510 nova_compute[231979]: 2025-11-29 06:46:10.593 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:46:10 np0005539510 nova_compute[231979]: 2025-11-29 06:46:10.593 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:46:10 np0005539510 nova_compute[231979]: 2025-11-29 06:46:10.631 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:46:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:46:11 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/616557685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:46:11 np0005539510 nova_compute[231979]: 2025-11-29 06:46:11.054 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:46:11 np0005539510 nova_compute[231979]: 2025-11-29 06:46:11.059 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:46:11 np0005539510 nova_compute[231979]: 2025-11-29 06:46:11.865 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:46:11 np0005539510 nova_compute[231979]: 2025-11-29 06:46:11.867 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:46:11 np0005539510 nova_compute[231979]: 2025-11-29 06:46:11.867 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:12.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:12.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:14.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:14.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:46:15.138 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:46:15.139 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:46:15.139 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:16.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:16.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:18.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:18.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:20.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:20.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:22.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:22.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:24.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:24.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:46:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:46:25 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:46:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:26.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:26.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:28.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:28.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:30.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:30.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:30 np0005539510 podman[233899]: 2025-11-29 06:46:30.898850463 +0000 UTC m=+0.058512442 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:46:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:30 np0005539510 podman[233898]: 2025-11-29 06:46:30.924731933 +0000 UTC m=+0.086982471 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:46:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:32.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:33 np0005539510 podman[233966]: 2025-11-29 06:46:33.21815158 +0000 UTC m=+0.054921057 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 29 01:46:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:46:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:46:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:34.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:34.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:35 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:46:35.706 143385 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:05:03', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'aa:d2:09:dd:a5:e1'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:46:35 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:46:35.707 143385 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:46:35 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:46:35.708 143385 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=fa6f2e5a-176a-4b37-8b2a-5aaf74119c47, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:46:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:36.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:36.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:38.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda3423c6f0 =====
Nov 29 01:46:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda3423c6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:38 np0005539510 radosgw[83467]: beast: 0x7fda3423c6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:38.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:40.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:42.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:46:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:44.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:44.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:46.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:46.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:48.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:48.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:50.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:50.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:52.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:52.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:54.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:54.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:46:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:56.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:46:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:56.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:46:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:46:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:46:58.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:46:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:46:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:46:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:46:58.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:00.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:00.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:01 np0005539510 podman[234127]: 2025-11-29 06:47:01.885661555 +0000 UTC m=+0.050208151 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 01:47:01 np0005539510 podman[234126]: 2025-11-29 06:47:01.916766145 +0000 UTC m=+0.083790307 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 01:47:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:02.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:02.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:03 np0005539510 podman[234176]: 2025-11-29 06:47:03.884410431 +0000 UTC m=+0.051498675 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:47:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:04.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:04.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:06.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:06.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:08.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:08.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:10.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:10.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:11 np0005539510 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:11 np0005539510 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:11 np0005539510 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:47:11 np0005539510 nova_compute[231979]: 2025-11-29 06:47:11.870 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:47:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:12.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:12.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.712 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.713 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.713 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.713 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:47:13 np0005539510 nova_compute[231979]: 2025-11-29 06:47:13.714 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.060 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.060 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.061 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.061 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.061 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:14.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:14 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:14 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3335706329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:14.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.529 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.692 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.694 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5328MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.694 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:14 np0005539510 nova_compute[231979]: 2025-11-29 06:47:14.694 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:47:15.139 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:47:15.140 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:47:15.140 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:15 np0005539510 nova_compute[231979]: 2025-11-29 06:47:15.955 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:47:15 np0005539510 nova_compute[231979]: 2025-11-29 06:47:15.955 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:47:15 np0005539510 nova_compute[231979]: 2025-11-29 06:47:15.991 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:16 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/209762820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:16 np0005539510 nova_compute[231979]: 2025-11-29 06:47:16.448 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:16 np0005539510 nova_compute[231979]: 2025-11-29 06:47:16.454 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:16.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:16.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:16 np0005539510 nova_compute[231979]: 2025-11-29 06:47:16.814 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:16 np0005539510 nova_compute[231979]: 2025-11-29 06:47:16.816 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:47:16 np0005539510 nova_compute[231979]: 2025-11-29 06:47:16.816 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.802 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.802 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.833 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.833 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.833 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.859 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.860 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.860 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.860 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:47:17 np0005539510 nova_compute[231979]: 2025-11-29 06:47:17.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.218 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.219 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.219 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.219 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.220 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:18.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:18 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/687770592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.682 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.846 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.847 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5281MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.847 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:18 np0005539510 nova_compute[231979]: 2025-11-29 06:47:18.848 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:19 np0005539510 nova_compute[231979]: 2025-11-29 06:47:19.361 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:47:19 np0005539510 nova_compute[231979]: 2025-11-29 06:47:19.362 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:47:19 np0005539510 nova_compute[231979]: 2025-11-29 06:47:19.381 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:19 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:47:19 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/728103463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:47:19 np0005539510 nova_compute[231979]: 2025-11-29 06:47:19.867 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:19 np0005539510 nova_compute[231979]: 2025-11-29 06:47:19.872 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:20 np0005539510 nova_compute[231979]: 2025-11-29 06:47:20.000 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:20 np0005539510 nova_compute[231979]: 2025-11-29 06:47:20.002 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:47:20 np0005539510 nova_compute[231979]: 2025-11-29 06:47:20.003 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:20.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:20.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:21 np0005539510 nova_compute[231979]: 2025-11-29 06:47:21.003 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:22.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:22.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:24.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:24.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:26.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:26.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:28.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:28.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:30.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:30.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:32.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:32.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:32 np0005539510 podman[234351]: 2025-11-29 06:47:32.914870034 +0000 UTC m=+0.067242154 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:47:32 np0005539510 podman[234350]: 2025-11-29 06:47:32.922763252 +0000 UTC m=+0.087498888 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller)
Nov 29 01:47:34 np0005539510 podman[234553]: 2025-11-29 06:47:34.366966985 +0000 UTC m=+0.096009563 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:47:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.002000053s ======
Nov 29 01:47:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:34.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Nov 29 01:47:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:47:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:35 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:47:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:36.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:36.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:38.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:40.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:42.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:42.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:43 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:47:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:44.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:44.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:46.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:46.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:48.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:48.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:50.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:50.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:52.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:52.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:54.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:54.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:47:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:47:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:56.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:47:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.076353) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877076425, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1875, "num_deletes": 251, "total_data_size": 4554771, "memory_usage": 4605256, "flush_reason": "Manual Compaction"}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877090092, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1737351, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18533, "largest_seqno": 20403, "table_properties": {"data_size": 1731823, "index_size": 2668, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14351, "raw_average_key_size": 20, "raw_value_size": 1719513, "raw_average_value_size": 2428, "num_data_blocks": 123, "num_entries": 708, "num_filter_entries": 708, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398703, "oldest_key_time": 1764398703, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 13764 microseconds, and 4941 cpu microseconds.
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.090126) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1737351 bytes OK
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.090142) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092088) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092102) EVENT_LOG_v1 {"time_micros": 1764398877092098, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092115) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4546375, prev total WAL file size 4546375, number of live WAL files 2.
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092978) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373535' seq:0, type:0; will stop at (end)
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1696KB)], [36(9238KB)]
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877092997, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11197985, "oldest_snapshot_seqno": -1}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4812 keys, 8586067 bytes, temperature: kUnknown
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877153042, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 8586067, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8553880, "index_size": 19085, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12037, "raw_key_size": 121250, "raw_average_key_size": 25, "raw_value_size": 8466690, "raw_average_value_size": 1759, "num_data_blocks": 783, "num_entries": 4812, "num_filter_entries": 4812, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.153417) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 8586067 bytes
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.154600) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.9 rd, 142.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.0 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(11.4) write-amplify(4.9) OK, records in: 5246, records dropped: 434 output_compression: NoCompression
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.154617) EVENT_LOG_v1 {"time_micros": 1764398877154609, "job": 20, "event": "compaction_finished", "compaction_time_micros": 60221, "compaction_time_cpu_micros": 20626, "output_level": 6, "num_output_files": 1, "total_output_size": 8586067, "num_input_records": 5246, "num_output_records": 4812, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877155640, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398877157538, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.092938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157684) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157688) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157689) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:57 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:47:57.157692) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:47:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:47:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:47:58.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:47:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:47:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:47:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:47:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:47:59 np0005539510 nova_compute[231979]: 2025-11-29 06:47:59.100 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 9.56 sec#033[00m
Nov 29 01:48:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:00.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:00.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:02.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:48:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:48:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:48:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4080925924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:48:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:02.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:03 np0005539510 podman[234718]: 2025-11-29 06:48:03.888913492 +0000 UTC m=+0.050942485 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:48:03 np0005539510 podman[234717]: 2025-11-29 06:48:03.91274735 +0000 UTC m=+0.078230844 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:48:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:04.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:04 np0005539510 podman[234759]: 2025-11-29 06:48:04.919877947 +0000 UTC m=+0.083123483 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:06.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:06.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:08.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:08.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:10.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:10.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:10 np0005539510 nova_compute[231979]: 2025-11-29 06:48:10.723 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 1.62 sec#033[00m
Nov 29 01:48:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:12.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:12.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:12 np0005539510 nova_compute[231979]: 2025-11-29 06:48:12.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:12 np0005539510 nova_compute[231979]: 2025-11-29 06:48:12.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:48:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:14.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:14.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:48:15.141 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:48:15.142 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:48:15.142 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:16.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:16.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:18.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:18.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:20.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:20.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:22.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:22.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:24.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:24.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:26.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:26.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:28.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:28.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:28 np0005539510 nova_compute[231979]: 2025-11-29 06:48:28.747 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:48:28 np0005539510 nova_compute[231979]: 2025-11-29 06:48:28.748 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:28 np0005539510 nova_compute[231979]: 2025-11-29 06:48:28.748 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:48:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:30.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:30.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:31 np0005539510 nova_compute[231979]: 2025-11-29 06:48:31.889 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 11.17 sec#033[00m
Nov 29 01:48:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:32.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:32.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:34.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:34.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:34 np0005539510 podman[234871]: 2025-11-29 06:48:34.887896125 +0000 UTC m=+0.047909495 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:34 np0005539510 podman[234869]: 2025-11-29 06:48:34.918763078 +0000 UTC m=+0.079909928 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:35 np0005539510 podman[234942]: 2025-11-29 06:48:35.88623943 +0000 UTC m=+0.053582604 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:48:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:36.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:36.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:38.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:38.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:40.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:42.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:42.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:44.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:44.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.076911) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925076984, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 711, "num_deletes": 251, "total_data_size": 1330634, "memory_usage": 1350960, "flush_reason": "Manual Compaction"}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925084585, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 878455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20409, "largest_seqno": 21114, "table_properties": {"data_size": 874937, "index_size": 1426, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7933, "raw_average_key_size": 19, "raw_value_size": 867897, "raw_average_value_size": 2121, "num_data_blocks": 62, "num_entries": 409, "num_filter_entries": 409, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398877, "oldest_key_time": 1764398877, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 7706 microseconds, and 3291 cpu microseconds.
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.084628) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 878455 bytes OK
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.084645) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086081) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086094) EVENT_LOG_v1 {"time_micros": 1764398925086090, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086109) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1326846, prev total WAL file size 1326846, number of live WAL files 2.
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086579) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(857KB)], [39(8384KB)]
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925086609, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 9464522, "oldest_snapshot_seqno": -1}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 4703 keys, 7370470 bytes, temperature: kUnknown
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925137967, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 7370470, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7340060, "index_size": 17564, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11781, "raw_key_size": 119556, "raw_average_key_size": 25, "raw_value_size": 7255747, "raw_average_value_size": 1542, "num_data_blocks": 714, "num_entries": 4703, "num_filter_entries": 4703, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764398925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.138227) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 7370470 bytes
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.139626) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.9 rd, 143.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 8.2 +0.0 blob) out(7.0 +0.0 blob), read-write-amplify(19.2) write-amplify(8.4) OK, records in: 5221, records dropped: 518 output_compression: NoCompression
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.139646) EVENT_LOG_v1 {"time_micros": 1764398925139637, "job": 22, "event": "compaction_finished", "compaction_time_micros": 51458, "compaction_time_cpu_micros": 15666, "output_level": 6, "num_output_files": 1, "total_output_size": 7370470, "num_input_records": 5221, "num_output_records": 4703, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925139943, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764398925141696, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.086527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141736) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:48:45.141739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:48:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:46.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:46.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:48.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:48.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:50 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:50 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:50 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:48:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:50.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:50.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:51 np0005539510 nova_compute[231979]: 2025-11-29 06:48:51.093 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:51 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:48:52 np0005539510 nova_compute[231979]: 2025-11-29 06:48:52.021 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.13 sec
Nov 29 01:48:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:52.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:48:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:52.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:48:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:54.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:54.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:48:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:56.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:56.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:48:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:48:58.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:48:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:48:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:48:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:48:58.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:48:59 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:48:59 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:49:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:00.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:00.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:01 np0005539510 nova_compute[231979]: 2025-11-29 06:49:01.822 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:01 np0005539510 nova_compute[231979]: 2025-11-29 06:49:01.823 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:01 np0005539510 nova_compute[231979]: 2025-11-29 06:49:01.823 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:49:01 np0005539510 nova_compute[231979]: 2025-11-29 06:49:01.823 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:49:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:02.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:02.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:04.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:04.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:05 np0005539510 podman[235327]: 2025-11-29 06:49:05.903774771 +0000 UTC m=+0.059671634 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 01:49:05 np0005539510 podman[235326]: 2025-11-29 06:49:05.932710274 +0000 UTC m=+0.088636338 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 01:49:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:05 np0005539510 podman[235367]: 2025-11-29 06:49:05.995023297 +0000 UTC m=+0.057841256 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:49:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:06.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:06.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:08.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:08.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:10.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:10.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:12.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:12.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:14.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:14.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:49:15.143 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:49:15.143 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:49:15.143 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:16.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:16.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:18.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:49:18 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 3893 writes, 21K keys, 3893 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 3893 writes, 3893 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1412 writes, 7260 keys, 1412 commit groups, 1.0 writes per commit group, ingest: 15.00 MB, 0.02 MB/s#012Interval WAL: 1412 writes, 1412 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     32.4      0.81              0.07        11    0.074       0      0       0.0       0.0#012  L6      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    120.7     99.7      0.91              0.24        10    0.091     49K   5241       0.0       0.0#012 Sum      1/0    7.03 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     63.9     68.0      1.72              0.30        21    0.082     49K   5241       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.7    131.7    125.5      0.49              0.17        12    0.041     31K   3467       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    120.7     99.7      0.91              0.24        10    0.091     49K   5241       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     32.4      0.81              0.07        10    0.081       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.026, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.11 GB write, 0.06 MB/s write, 0.11 GB read, 0.06 MB/s read, 1.7 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55be896f31f0#2 capacity: 304.00 MB usage: 8.06 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 6.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(419,7.66 MB,2.52095%) FilterBlock(21,140.86 KB,0.0452493%) IndexBlock(21,268.67 KB,0.0863075%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 29 01:49:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:18.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:20.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:20.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:22.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:22.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:24.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:26.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:28.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:28.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:30.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:30.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:49:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:32.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:49:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:32.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.164 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.165 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.166 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.166 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:49:34 np0005539510 nova_compute[231979]: 2025-11-29 06:49:34.166 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:34.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:35 np0005539510 nova_compute[231979]: 2025-11-29 06:49:35.461 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 23.44 sec#033[00m
Nov 29 01:49:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:36.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:36.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:36 np0005539510 podman[235510]: 2025-11-29 06:49:36.900214088 +0000 UTC m=+0.058016661 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:49:36 np0005539510 podman[235509]: 2025-11-29 06:49:36.902219981 +0000 UTC m=+0.061140834 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:49:36 np0005539510 podman[235508]: 2025-11-29 06:49:36.917130184 +0000 UTC m=+0.082409074 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:49:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:38.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:38.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:40.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:40.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:42.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:42.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:44.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:44.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:46.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:46.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:49:48 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 6280 writes, 25K keys, 6280 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 6280 writes, 1161 syncs, 5.41 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 484 writes, 738 keys, 484 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s#012Interval WAL: 484 writes, 238 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 01:49:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:48.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:48.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:49:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:50.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:49:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:50.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.116 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.116 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.117 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.117 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.117 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:49:51 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2261730173' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.534 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.681 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.682 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5312MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.682 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:51 np0005539510 nova_compute[231979]: 2025-11-29 06:49:51.683 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:52 np0005539510 ceph-mgr[77504]: client.0 ms_handle_reset on v2:192.168.122.100:6800/1221624088
Nov 29 01:49:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:52.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:52.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:54.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:54.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:49:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:56.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:49:58.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:49:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:49:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:49:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:49:58.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:00.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:00 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:00 np0005539510 ceph-mon[77142]: overall HEALTH_OK
Nov 29 01:50:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:00.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:02 np0005539510 nova_compute[231979]: 2025-11-29 06:50:02.207 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 16.75 sec#033[00m
Nov 29 01:50:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:50:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:50:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:50:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2319096117' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:50:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:02.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:02.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:04.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:04.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:06.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:06.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:50:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:07 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:50:07 np0005539510 podman[235791]: 2025-11-29 06:50:07.916849596 +0000 UTC m=+0.077880319 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 29 01:50:07 np0005539510 podman[235790]: 2025-11-29 06:50:07.922186259 +0000 UTC m=+0.085458862 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 01:50:07 np0005539510 podman[235792]: 2025-11-29 06:50:07.92221843 +0000 UTC m=+0.080458628 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 29 01:50:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:08.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:08.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:10.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:10.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:12.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:12.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:14.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:14.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:50:15.144 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:50:15.144 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:50:15.144 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:16.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:16.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:18.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:18.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:19 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:50:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:20.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:20.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:22.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:22.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:22 np0005539510 nova_compute[231979]: 2025-11-29 06:50:22.839 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.63 sec#033[00m
Nov 29 01:50:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:24.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:24.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:26.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:26.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:28.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:28.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:30.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:30.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:32.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:32.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:34.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:34.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:36.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:36.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:38.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:38.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:38 np0005539510 podman[236023]: 2025-11-29 06:50:38.893957828 +0000 UTC m=+0.055692474 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:38 np0005539510 podman[236024]: 2025-11-29 06:50:38.901571352 +0000 UTC m=+0.060788081 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:50:38 np0005539510 podman[236022]: 2025-11-29 06:50:38.924191379 +0000 UTC m=+0.086464430 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.569 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.570 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.588 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing inventories for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.604 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating ProviderTree inventory for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.605 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.640 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing aggregate associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.680 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing trait associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:50:39 np0005539510 nova_compute[231979]: 2025-11-29 06:50:39.725 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:50:40 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3662469046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:50:40 np0005539510 nova_compute[231979]: 2025-11-29 06:50:40.351 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.627s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:40 np0005539510 nova_compute[231979]: 2025-11-29 06:50:40.358 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:40 np0005539510 nova_compute[231979]: 2025-11-29 06:50:40.415 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.58 sec#033[00m
Nov 29 01:50:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:40.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:40.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:42.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:44.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:44.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:46.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:50:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:46.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:48.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:50.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:50.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:52.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:52.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:54.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:50:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:54.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:50:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:50:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:56.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:56.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:50:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:50:58.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:50:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:50:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:50:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:50:58.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:00.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:00.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:02 np0005539510 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 29 01:51:02 np0005539510 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Nov 29 01:51:02 np0005539510 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Nov 29 01:51:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:51:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:51:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:51:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3119268237' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:51:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:02.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:02.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:03 np0005539510 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Nov 29 01:51:03 np0005539510 radosgw[83467]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 29 01:51:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:04.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:04.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:05 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:06.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:06.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:08.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:08.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:09 np0005539510 podman[236177]: 2025-11-29 06:51:09.899108305 +0000 UTC m=+0.052607192 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:51:09 np0005539510 podman[236178]: 2025-11-29 06:51:09.910061698 +0000 UTC m=+0.057341818 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:51:09 np0005539510 podman[236176]: 2025-11-29 06:51:09.940664929 +0000 UTC m=+0.087053935 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:51:10 np0005539510 nova_compute[231979]: 2025-11-29 06:51:10.198 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:10 np0005539510 nova_compute[231979]: 2025-11-29 06:51:10.199 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:51:10 np0005539510 nova_compute[231979]: 2025-11-29 06:51:10.200 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 78.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:10.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:10.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:10 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:11 np0005539510 nova_compute[231979]: 2025-11-29 06:51:11.695 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 21.28 sec#033[00m
Nov 29 01:51:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:12.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:14.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:51:15.145 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:51:15.146 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:51:15.146 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:16.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:16.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:18.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:18.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:20 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:20 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:20 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 29 01:51:20 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 29 01:51:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:20.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.772471242 +0000 UTC m=+0.042598593 container create d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, CEPH_REF=reef, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 29 01:51:20 np0005539510 systemd[1]: Started libpod-conmon-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope.
Nov 29 01:51:20 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.752657691 +0000 UTC m=+0.022785062 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.865017714 +0000 UTC m=+0.135145075 container init d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.877279522 +0000 UTC m=+0.147406863 container start d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.880379636 +0000 UTC m=+0.150507007 container attach d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:51:20 np0005539510 eloquent_neumann[236703]: 167 167
Nov 29 01:51:20 np0005539510 systemd[1]: libpod-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope: Deactivated successfully.
Nov 29 01:51:20 np0005539510 conmon[236703]: conmon d7f10d0da8f49b7c2494 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope/container/memory.events
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.888608086 +0000 UTC m=+0.158735437 container died d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 29 01:51:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:20 np0005539510 systemd[1]: var-lib-containers-storage-overlay-4626f7b5634d1712807b4095bac8b1ff8d2f4f77119ab23bc2082f8a0c21e217-merged.mount: Deactivated successfully.
Nov 29 01:51:20 np0005539510 podman[236687]: 2025-11-29 06:51:20.934018874 +0000 UTC m=+0.204146215 container remove d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=eloquent_neumann, io.buildah.version=1.39.3, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:51:20 np0005539510 systemd[1]: libpod-conmon-d7f10d0da8f49b7c24942f100db25e87344250561a2fd981c24c8d06b166999d.scope: Deactivated successfully.
Nov 29 01:51:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:21 np0005539510 podman[236730]: 2025-11-29 06:51:21.116525417 +0000 UTC m=+0.047293099 container create a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 29 01:51:21 np0005539510 systemd[1]: Started libpod-conmon-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope.
Nov 29 01:51:21 np0005539510 systemd[1]: Started libcrun container.
Nov 29 01:51:21 np0005539510 podman[236730]: 2025-11-29 06:51:21.099596453 +0000 UTC m=+0.030364155 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Nov 29 01:51:21 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 29 01:51:21 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 29 01:51:21 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 29 01:51:21 np0005539510 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 29 01:51:21 np0005539510 podman[236730]: 2025-11-29 06:51:21.21511785 +0000 UTC m=+0.145885592 container init a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 29 01:51:21 np0005539510 podman[236730]: 2025-11-29 06:51:21.221179043 +0000 UTC m=+0.151946725 container start a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 29 01:51:21 np0005539510 podman[236730]: 2025-11-29 06:51:21.228674784 +0000 UTC m=+0.159442566 container attach a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Nov 29 01:51:22 np0005539510 nova_compute[231979]: 2025-11-29 06:51:22.233 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:22 np0005539510 nova_compute[231979]: 2025-11-29 06:51:22.234 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]: [
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:    {
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "available": false,
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "ceph_device": false,
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "lsm_data": {},
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "lvs": [],
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "path": "/dev/sr0",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "rejected_reasons": [
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "Insufficient space (<5GB)",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "Has a FileSystem"
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        ],
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        "sys_api": {
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "actuators": null,
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "device_nodes": "sr0",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "devname": "sr0",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "human_readable_size": "482.00 KB",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "id_bus": "ata",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "model": "QEMU DVD-ROM",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "nr_requests": "2",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "parent": "/dev/sr0",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "partitions": {},
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "path": "/dev/sr0",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "removable": "1",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "rev": "2.5+",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "ro": "0",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "rotational": "1",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "sas_address": "",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "sas_device_handle": "",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "scheduler_mode": "mq-deadline",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "sectors": 0,
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "sectorsize": "2048",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "size": 493568.0,
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "support_discard": "2048",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "type": "disk",
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:            "vendor": "QEMU"
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:        }
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]:    }
Nov 29 01:51:22 np0005539510 flamboyant_matsumoto[236747]: ]
Nov 29 01:51:22 np0005539510 systemd[1]: libpod-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope: Deactivated successfully.
Nov 29 01:51:22 np0005539510 systemd[1]: libpod-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope: Consumed 1.157s CPU time.
Nov 29 01:51:22 np0005539510 podman[236730]: 2025-11-29 06:51:22.371158805 +0000 UTC m=+1.301926507 container died a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:51:22 np0005539510 systemd[1]: var-lib-containers-storage-overlay-463ab74fa081df70c85fb330a50280b93023915477db9227bcb1c8e2dd7ef488-merged.mount: Deactivated successfully.
Nov 29 01:51:22 np0005539510 podman[236730]: 2025-11-29 06:51:22.431433041 +0000 UTC m=+1.362200723 container remove a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=flamboyant_matsumoto, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Nov 29 01:51:22 np0005539510 systemd[1]: libpod-conmon-a3e94d9aeebddd00e3ad90adf1f35d1824ad9374c35079091aa6b50f71e2b56a.scope: Deactivated successfully.
Nov 29 01:51:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:22.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:22.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:51:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:51:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:25 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:26.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:26.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:28 np0005539510 nova_compute[231979]: 2025-11-29 06:51:28.275 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.58 sec#033[00m
Nov 29 01:51:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.005000134s ======
Nov 29 01:51:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:28.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000134s
Nov 29 01:51:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:28.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:29 np0005539510 nova_compute[231979]: 2025-11-29 06:51:29.385 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:29 np0005539510 nova_compute[231979]: 2025-11-29 06:51:29.385 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:51:29 np0005539510 nova_compute[231979]: 2025-11-29 06:51:29.385 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.500 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:51:30 np0005539510 nova_compute[231979]: 2025-11-29 06:51:30.501 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:30.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:30.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:30 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.080 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.081 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.081 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.081 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.082 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:51:31 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2305942895' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.552 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.710 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.712 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5255MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.712 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:31 np0005539510 nova_compute[231979]: 2025-11-29 06:51:31.712 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:32.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:32.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:32 np0005539510 nova_compute[231979]: 2025-11-29 06:51:32.994 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:51:32 np0005539510 nova_compute[231979]: 2025-11-29 06:51:32.994 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:51:33 np0005539510 nova_compute[231979]: 2025-11-29 06:51:33.070 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:51:33 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1735230033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:51:33 np0005539510 nova_compute[231979]: 2025-11-29 06:51:33.511 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:33 np0005539510 nova_compute[231979]: 2025-11-29 06:51:33.518 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:34.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:34.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:35 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:36 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:51:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:36.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:38.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:38.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:40 np0005539510 nova_compute[231979]: 2025-11-29 06:51:40.527 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:40 np0005539510 nova_compute[231979]: 2025-11-29 06:51:40.528 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:51:40 np0005539510 nova_compute[231979]: 2025-11-29 06:51:40.529 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 8.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:40.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:40 np0005539510 podman[238101]: 2025-11-29 06:51:40.918067538 +0000 UTC m=+0.071006875 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:51:40 np0005539510 podman[238102]: 2025-11-29 06:51:40.932571837 +0000 UTC m=+0.090075496 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 01:51:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:40.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:40 np0005539510 podman[238100]: 2025-11-29 06:51:40.959063647 +0000 UTC m=+0.108794748 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:51:40 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:42.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:42.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:44.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:44.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:45 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:46.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:46.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:48.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:51:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:50.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:51:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:50.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:52.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:52.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:54.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:54.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:51:56 np0005539510 nova_compute[231979]: 2025-11-29 06:51:56.044 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.77 sec#033[00m
Nov 29 01:51:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:51:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:51:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:56.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:51:58.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:51:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:51:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:51:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:51:58.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:00.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:00.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:52:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:52:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:52:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3126230656' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:52:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:02.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:02.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:04.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:04.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:06.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:06.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:08.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:08.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:10.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:11.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:11 np0005539510 podman[238233]: 2025-11-29 06:52:11.890621933 +0000 UTC m=+0.050884783 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 01:52:11 np0005539510 podman[238234]: 2025-11-29 06:52:11.901663428 +0000 UTC m=+0.058501536 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:52:11 np0005539510 podman[238232]: 2025-11-29 06:52:11.919819464 +0000 UTC m=+0.082505469 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:52:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:12.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:13.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:14.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:15.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:52:15.146 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:52:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:52:15.147 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:52:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:52:15.147 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:52:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:16.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:17.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:18.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:19.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:20.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:21.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:22.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:23.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:24.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:25.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:26.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:27.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.502552) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147502605, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 251, "total_data_size": 6203043, "memory_usage": 6281232, "flush_reason": "Manual Compaction"}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147527147, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 4024087, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21119, "largest_seqno": 23472, "table_properties": {"data_size": 4014455, "index_size": 6126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19000, "raw_average_key_size": 20, "raw_value_size": 3995455, "raw_average_value_size": 4214, "num_data_blocks": 274, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764398925, "oldest_key_time": 1764398925, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 24643 microseconds, and 11073 cpu microseconds.
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527195) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 4024087 bytes OK
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.527217) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.530960) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.530992) EVENT_LOG_v1 {"time_micros": 1764399147530985, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.531010) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 6192706, prev total WAL file size 6192706, number of live WAL files 2.
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.532423) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(3929KB)], [42(7197KB)]
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147532479, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 11394557, "oldest_snapshot_seqno": -1}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5132 keys, 9349035 bytes, temperature: kUnknown
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147605240, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9349035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9314272, "index_size": 20829, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 128878, "raw_average_key_size": 25, "raw_value_size": 9220817, "raw_average_value_size": 1796, "num_data_blocks": 857, "num_entries": 5132, "num_filter_entries": 5132, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764399147, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.605481) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9349035 bytes
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.607177) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.4 rd, 128.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 7.0 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.2) write-amplify(2.3) OK, records in: 5651, records dropped: 519 output_compression: NoCompression
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.607199) EVENT_LOG_v1 {"time_micros": 1764399147607189, "job": 24, "event": "compaction_finished", "compaction_time_micros": 72833, "compaction_time_cpu_micros": 22100, "output_level": 6, "num_output_files": 1, "total_output_size": 9349035, "num_input_records": 5651, "num_output_records": 5132, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147608165, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399147609760, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.532367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609843) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:27 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:27.609848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:28.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:29.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.166898) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149166932, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 262, "num_deletes": 256, "total_data_size": 20532, "memory_usage": 27256, "flush_reason": "Manual Compaction"}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149168602, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 13142, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23474, "largest_seqno": 23734, "table_properties": {"data_size": 11326, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4242, "raw_average_key_size": 16, "raw_value_size": 7889, "raw_average_value_size": 30, "num_data_blocks": 2, "num_entries": 261, "num_filter_entries": 261, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 1785 microseconds, and 640 cpu microseconds.
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.168682) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 13142 bytes OK
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.168734) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.169783) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.169833) EVENT_LOG_v1 {"time_micros": 1764399149169826, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.169851) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 18466, prev total WAL file size 18466, number of live WAL files 2.
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170369) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323534' seq:72057594037927935, type:22 .. '6C6F676D00353036' seq:0, type:0; will stop at (end)
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(12KB)], [45(9129KB)]
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149170401, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9362177, "oldest_snapshot_seqno": -1}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4877 keys, 9228475 bytes, temperature: kUnknown
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149233288, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 9228475, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9194913, "index_size": 20268, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 124775, "raw_average_key_size": 25, "raw_value_size": 9105384, "raw_average_value_size": 1867, "num_data_blocks": 828, "num_entries": 4877, "num_filter_entries": 4877, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764399149, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.233519) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 9228475 bytes
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.234944) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 148.7 rd, 146.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 8.9 +0.0 blob) out(8.8 +0.0 blob), read-write-amplify(1414.6) write-amplify(702.2) OK, records in: 5393, records dropped: 516 output_compression: NoCompression
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.234966) EVENT_LOG_v1 {"time_micros": 1764399149234956, "job": 26, "event": "compaction_finished", "compaction_time_micros": 62960, "compaction_time_cpu_micros": 18764, "output_level": 6, "num_output_files": 1, "total_output_size": 9228475, "num_input_records": 5393, "num_output_records": 4877, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149235083, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399149236972, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.170310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:29 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:52:29.237096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:52:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:30.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:31.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:32.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:52:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:33.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:52:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:34.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:35.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:36.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:37.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:37 np0005539510 podman[238583]: 2025-11-29 06:52:37.062633199 +0000 UTC m=+0.089104065 container exec 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Nov 29 01:52:37 np0005539510 podman[238583]: 2025-11-29 06:52:37.17106268 +0000 UTC m=+0.197533536 container exec_died 39a71a201814a8b91517123ef896eee9fba9aa58250e8991d52c95150c3afcf1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-mon-compute-2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Nov 29 01:52:37 np0005539510 podman[238733]: 2025-11-29 06:52:37.867660309 +0000 UTC m=+0.052782383 container exec e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:52:37 np0005539510 podman[238733]: 2025-11-29 06:52:37.878105238 +0000 UTC m=+0.063227282 container exec_died e9146d28074e77024b71630777a636a9be1f96d6c8de7fe1d0f6b8ea9b7c2629 (image=quay.io/ceph/haproxy:2.3, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-haproxy-rgw-default-compute-2-lpqgfx)
Nov 29 01:52:38 np0005539510 podman[238797]: 2025-11-29 06:52:38.063444187 +0000 UTC m=+0.047766199 container exec d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, build-date=2023-02-22T09:23:20, architecture=x86_64, release=1793, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.28.2, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, vcs-type=git, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:52:38 np0005539510 podman[238797]: 2025-11-29 06:52:38.078220213 +0000 UTC m=+0.062542245 container exec_died d5351020227e3ec7123cdc8e246da28574527444c7ccd1e10dc70576b52ccb35 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-336ec58c-893b-528f-a0c1-6ed1196bc047-keepalived-rgw-default-compute-2-klqjoa, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=keepalived, description=keepalived for Ceph)
Nov 29 01:52:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:39.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 29 01:52:39 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:52:40 np0005539510 nova_compute[231979]: 2025-11-29 06:52:40.531 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:40 np0005539510 nova_compute[231979]: 2025-11-29 06:52:40.531 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:40 np0005539510 nova_compute[231979]: 2025-11-29 06:52:40.531 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:52:40 np0005539510 nova_compute[231979]: 2025-11-29 06:52:40.532 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:52:40 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:40 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:52:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:40.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:41.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:42.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:42 np0005539510 podman[238967]: 2025-11-29 06:52:42.914612057 +0000 UTC m=+0.063543182 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:52:42 np0005539510 podman[238966]: 2025-11-29 06:52:42.935030833 +0000 UTC m=+0.089322771 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:52:42 np0005539510 podman[238965]: 2025-11-29 06:52:42.94054894 +0000 UTC m=+0.094268813 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 01:52:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:43.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:44.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:45.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:46.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:47.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:47 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:52:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:48.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:49.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:50.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:51.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:52.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:52:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:53.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:52:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:54.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:55.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:52:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:56.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:57.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:52:58.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:52:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:52:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:52:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:52:59.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:00.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:01.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:02.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:03.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:04.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:05.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:06.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:07.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:08.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:53:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:09.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:53:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:10.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:11.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:12.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:13.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:13 np0005539510 podman[239142]: 2025-11-29 06:53:13.925725466 +0000 UTC m=+0.076093977 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:53:13 np0005539510 podman[239143]: 2025-11-29 06:53:13.938349854 +0000 UTC m=+0.078792839 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd)
Nov 29 01:53:13 np0005539510 podman[239141]: 2025-11-29 06:53:13.96359792 +0000 UTC m=+0.111626238 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:53:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:14.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:15.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:53:15.147 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:53:15.148 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:53:15.148 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:16.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:17.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.472 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.473 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.473 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.473 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.474 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.475 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:18.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:18 np0005539510 nova_compute[231979]: 2025-11-29 06:53:18.923 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 52.88 sec#033[00m
Nov 29 01:53:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:19.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:20.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:21.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:22.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:23.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.181 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.181 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.182 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.182 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.183 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:53:23 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2185145600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.598 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.764 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.766 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5280MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.766 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:23 np0005539510 nova_compute[231979]: 2025-11-29 06:53:23.766 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:24.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:25.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:26.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:27.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:28.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:29.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:29 np0005539510 nova_compute[231979]: 2025-11-29 06:53:29.173 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.25 sec#033[00m
Nov 29 01:53:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:30.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:32.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:33.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:34.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:35.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:36.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:37.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:38.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:39.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:40.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:41.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:42.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:43.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:44.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:44 np0005539510 podman[239341]: 2025-11-29 06:53:44.894996535 +0000 UTC m=+0.053534513 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:53:44 np0005539510 podman[239340]: 2025-11-29 06:53:44.909972896 +0000 UTC m=+0.064546868 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:53:44 np0005539510 podman[239339]: 2025-11-29 06:53:44.92360315 +0000 UTC m=+0.089473805 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:53:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:45.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:46.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:53:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:47.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:53:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:48.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:49 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:53:49 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:53:49 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:53:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:49.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:50.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:51 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:51.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:52 np0005539510 nova_compute[231979]: 2025-11-29 06:53:51.999 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:53:52 np0005539510 nova_compute[231979]: 2025-11-29 06:53:51.999 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:53:52 np0005539510 nova_compute[231979]: 2025-11-29 06:53:52.045 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:52 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:53:52 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3731896802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:53:52 np0005539510 nova_compute[231979]: 2025-11-29 06:53:52.480 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:52 np0005539510 nova_compute[231979]: 2025-11-29 06:53:52.487 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:53:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:52.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:53.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:54 np0005539510 nova_compute[231979]: 2025-11-29 06:53:54.475 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:53:54 np0005539510 nova_compute[231979]: 2025-11-29 06:53:54.479 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:53:54 np0005539510 nova_compute[231979]: 2025-11-29 06:53:54.480 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 30.713s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:53:54 np0005539510 nova_compute[231979]: 2025-11-29 06:53:54.481 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:53:54 np0005539510 nova_compute[231979]: 2025-11-29 06:53:54.481 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 01:53:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:54.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:55.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:56 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:53:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:56.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:57.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:53:58.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:53:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:53:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:53:59.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:53:59 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:53:59 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:54:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:00.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:01 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:01.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:02 np0005539510 nova_compute[231979]: 2025-11-29 06:54:02.642 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 13.47 sec
Nov 29 01:54:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:02.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:03 np0005539510 nova_compute[231979]: 2025-11-29 06:54:03.047 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 01:54:03 np0005539510 nova_compute[231979]: 2025-11-29 06:54:03.048 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:03 np0005539510 nova_compute[231979]: 2025-11-29 06:54:03.048 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 01:54:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:03.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:04 np0005539510 nova_compute[231979]: 2025-11-29 06:54:04.485 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:04.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:05.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:06 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:06.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:07.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:08.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:09.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:10.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:11.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:11 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:12.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:13.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:54:15.148 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:54:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:54:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:54:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:54:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:54:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:15.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:15 np0005539510 podman[239671]: 2025-11-29 06:54:15.89367084 +0000 UTC m=+0.054742320 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:54:15 np0005539510 podman[239670]: 2025-11-29 06:54:15.905703433 +0000 UTC m=+0.067002159 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:54:15 np0005539510 podman[239669]: 2025-11-29 06:54:15.945251753 +0000 UTC m=+0.111924403 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 29 01:54:16 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:16.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:17.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:18.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:19.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:19 np0005539510 nova_compute[231979]: 2025-11-29 06:54:19.737 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:19 np0005539510 nova_compute[231979]: 2025-11-29 06:54:19.737 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:20.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:21.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:21 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:22.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:23.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:24.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:25.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:26.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:27.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:27 np0005539510 nova_compute[231979]: 2025-11-29 06:54:27.876 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:27 np0005539510 nova_compute[231979]: 2025-11-29 06:54:27.877 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 01:54:27 np0005539510 nova_compute[231979]: 2025-11-29 06:54:27.877 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 01:54:28 np0005539510 nova_compute[231979]: 2025-11-29 06:54:28.894 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.25 sec
Nov 29 01:54:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:28.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:29.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:30.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:31.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:31 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.725 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.725 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.725 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:32 np0005539510 nova_compute[231979]: 2025-11-29 06:54:32.726 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:32.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:33.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:34.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:35.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:54:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:54:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:54:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3604148447' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:54:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:36.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:37.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:38.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:39.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:40.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:41.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:41 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:54:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:54:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:43.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:44.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:45.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:46 np0005539510 podman[239850]: 2025-11-29 06:54:46.886646976 +0000 UTC m=+0.047875315 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 29 01:54:46 np0005539510 podman[239851]: 2025-11-29 06:54:46.895019701 +0000 UTC m=+0.054693818 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 01:54:46 np0005539510 podman[239849]: 2025-11-29 06:54:46.917644898 +0000 UTC m=+0.079708889 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:54:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:46.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:47.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:48.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:48 np0005539510 nova_compute[231979]: 2025-11-29 06:54:48.954 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:48 np0005539510 nova_compute[231979]: 2025-11-29 06:54:48.954 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 01:54:48 np0005539510 nova_compute[231979]: 2025-11-29 06:54:48.954 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:54:49 np0005539510 nova_compute[231979]: 2025-11-29 06:54:49.154 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.26 sec
Nov 29 01:54:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:54:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:49.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:54:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:50.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:51.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:52.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:53.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:54 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:54 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:54 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:54.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:56 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:56 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:56 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:56.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:57.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:54:58 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:58 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:58 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:54:58.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:54:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:54:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:54:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:54:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:00 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:00 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:00 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:00.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:01.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:01 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:55:01 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:55:01 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.246 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.247 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.247 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.247 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.248 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:55:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:55:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2982005399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.775 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:55:02 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:02 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:02 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:02.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.949 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.951 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5261MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.951 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:55:02 np0005539510 nova_compute[231979]: 2025-11-29 06:55:02.951 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:55:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:03.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:04 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:04 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:04 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:04.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:05.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:06 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:06 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:06 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:06.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:08 np0005539510 nova_compute[231979]: 2025-11-29 06:55:08.324 231983 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 9.17 sec#033[00m
Nov 29 01:55:08 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:08 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:08 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:08.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:09.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:10 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:10 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:10 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:10.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:11.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:12 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:12 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:12 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:12.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:13.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:14 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:14 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:14 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:14.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:55:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:55:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:55:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:15.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.359 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.360 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.390 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:15 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:55:15 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3008667807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.795 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.800 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.902 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.904 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:55:15 np0005539510 nova_compute[231979]: 2025-11-29 06:55:15.904 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 12.953s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:16 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:16 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:16 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:16.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:17 np0005539510 podman[240155]: 2025-11-29 06:55:17.912356942 +0000 UTC m=+0.066633918 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:55:17 np0005539510 podman[240156]: 2025-11-29 06:55:17.976686988 +0000 UTC m=+0.119271570 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 01:55:17 np0005539510 podman[240154]: 2025-11-29 06:55:17.99353429 +0000 UTC m=+0.150698893 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:55:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:18 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:18 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:18 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:18.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:19.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:20 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:20 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:55:20 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:20.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:55:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:21.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:21 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:55:21 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:55:22 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:22 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:22 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:22.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:23.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:24 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:24 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:24 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:24.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:25.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:26 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:26 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:26 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:26.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:27.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:28 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:28 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:28 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:28.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:29.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:30 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:30 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:30 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:30.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:31.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:32 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:32 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:32 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:32.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:33.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:34 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:34 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:34 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:34.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:35.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:55:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:55:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:55:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/212983180' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:55:36 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:36 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:36 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:36.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:37.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:38 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:38 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:38 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:38.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:39.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:40 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:40 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:40 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:40.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:41.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:42 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:42 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:42 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:42.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:43.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:44 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:44 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:44 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:44.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:45.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:46 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:46 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:46 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:46.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:47.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:48 np0005539510 podman[240384]: 2025-11-29 06:55:48.918527894 +0000 UTC m=+0.060882114 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:55:48 np0005539510 podman[240385]: 2025-11-29 06:55:48.927420443 +0000 UTC m=+0.062778085 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 29 01:55:48 np0005539510 podman[240383]: 2025-11-29 06:55:48.943222657 +0000 UTC m=+0.086059970 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:55:48 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:48 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:48 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:48.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:49.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:50 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:50 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:55:50 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:50.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:55:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:51.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:52 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:52 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:52 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:52.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:53.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:54.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:55.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:57.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:57.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:55:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:55:59.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:55:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:55:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:55:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:55:59.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:01.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:01.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:03.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:03.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:05.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:05.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:07.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:09.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:09.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:11.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:13.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:15.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:56:15.149 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:56:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:56:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:15 np0005539510 nova_compute[231979]: 2025-11-29 06:56:15.906 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:15 np0005539510 nova_compute[231979]: 2025-11-29 06:56:15.907 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:17.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:17.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:17 np0005539510 nova_compute[231979]: 2025-11-29 06:56:17.901 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:17 np0005539510 nova_compute[231979]: 2025-11-29 06:56:17.901 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:56:17 np0005539510 nova_compute[231979]: 2025-11-29 06:56:17.901 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:56:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.291 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.292 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.293 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.497 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.498 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.498 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.498 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.499 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:56:18 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/217955619' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:56:18 np0005539510 nova_compute[231979]: 2025-11-29 06:56:18.938 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:19.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:19 np0005539510 podman[240556]: 2025-11-29 06:56:19.095836676 +0000 UTC m=+0.056466840 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 01:56:19 np0005539510 podman[240554]: 2025-11-29 06:56:19.114859187 +0000 UTC m=+0.082608502 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 01:56:19 np0005539510 podman[240555]: 2025-11-29 06:56:19.12089451 +0000 UTC m=+0.085708836 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.147 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.149 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5275MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.149 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.149 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.380444) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379380508, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2336, "num_deletes": 251, "total_data_size": 5965364, "memory_usage": 6051520, "flush_reason": "Manual Compaction"}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 29 01:56:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:19.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379404572, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3916609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23739, "largest_seqno": 26070, "table_properties": {"data_size": 3907165, "index_size": 6002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18818, "raw_average_key_size": 20, "raw_value_size": 3888452, "raw_average_value_size": 4154, "num_data_blocks": 268, "num_entries": 936, "num_filter_entries": 936, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764399149, "oldest_key_time": 1764399149, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 24154 microseconds, and 11347 cpu microseconds.
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.404618) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3916609 bytes OK
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.404639) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.405952) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.405967) EVENT_LOG_v1 {"time_micros": 1764399379405962, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.405982) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5955169, prev total WAL file size 5955169, number of live WAL files 2.
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407096) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3824KB)], [48(9012KB)]
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379407140, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 13145084, "oldest_snapshot_seqno": -1}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5296 keys, 11154011 bytes, temperature: kUnknown
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379488409, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 11154011, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11116174, "index_size": 23519, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13253, "raw_key_size": 134023, "raw_average_key_size": 25, "raw_value_size": 11017725, "raw_average_value_size": 2080, "num_data_blocks": 968, "num_entries": 5296, "num_filter_entries": 5296, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764397158, "oldest_key_time": 0, "file_creation_time": 1764399379, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "70291dfa-fb4b-4030-8b2f-275b626805e0", "db_session_id": "VR5455MVOXQY2YZBKO9G", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.488637) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 11154011 bytes
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.490200) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.6 rd, 137.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 8.8 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.2) write-amplify(2.8) OK, records in: 5813, records dropped: 517 output_compression: NoCompression
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.490219) EVENT_LOG_v1 {"time_micros": 1764399379490209, "job": 28, "event": "compaction_finished", "compaction_time_micros": 81329, "compaction_time_cpu_micros": 26075, "output_level": 6, "num_output_files": 1, "total_output_size": 11154011, "num_input_records": 5813, "num_output_records": 5296, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379491033, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764399379492854, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.407036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539510 ceph-mon[77142]: rocksdb: (Original Log Time 2025/11/29-06:56:19.492971) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.658 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.659 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.677 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing inventories for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.718 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating ProviderTree inventory for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.719 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Updating inventory in ProviderTree for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.734 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing aggregate associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.751 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Refreshing trait associations for resource provider 98b21ca7-b42c-4765-935a-26a89197ffb9, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_USB,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:56:19 np0005539510 nova_compute[231979]: 2025-11-29 06:56:19.977 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:20 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:56:20 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2239183731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:56:20 np0005539510 nova_compute[231979]: 2025-11-29 06:56:20.426 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:20 np0005539510 nova_compute[231979]: 2025-11-29 06:56:20.431 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:20 np0005539510 nova_compute[231979]: 2025-11-29 06:56:20.529 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:20 np0005539510 nova_compute[231979]: 2025-11-29 06:56:20.530 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:56:20 np0005539510 nova_compute[231979]: 2025-11-29 06:56:20.531 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:21.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:21.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:22 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:56:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:23.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:23.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:56:23 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:56:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:25.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:25.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:27.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:27.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:29.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:29.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:31.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:31.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:33.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:56:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:56:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:56:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:33.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:56:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:35.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:37.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:37.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:39.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:39.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:41.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:41.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:43.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:43.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:45.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:45.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:47.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:47.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:49.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:49.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:49 np0005539510 podman[240913]: 2025-11-29 06:56:49.924595009 +0000 UTC m=+0.081920924 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:56:49 np0005539510 podman[240914]: 2025-11-29 06:56:49.935852242 +0000 UTC m=+0.081531874 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 01:56:49 np0005539510 podman[240912]: 2025-11-29 06:56:49.95773593 +0000 UTC m=+0.117473650 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:56:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:51.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:51.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:53.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:53.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:55.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:55.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:57.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:56:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:57.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:56:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:56:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:56:59.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:56:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:56:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:56:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:56:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:01.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:01.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:03.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:03.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:05.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:05.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:07.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:07.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:09.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:09.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:11.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:11.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:13.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:13.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:15.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:57:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:57:15.150 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:57:15.151 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:15.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:17.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:17.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:19.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:19.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:20 np0005539510 nova_compute[231979]: 2025-11-29 06:57:20.533 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:20 np0005539510 nova_compute[231979]: 2025-11-29 06:57:20.533 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:20 np0005539510 podman[241094]: 2025-11-29 06:57:20.904233712 +0000 UTC m=+0.057156978 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 29 01:57:20 np0005539510 podman[241092]: 2025-11-29 06:57:20.918743582 +0000 UTC m=+0.078025939 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:57:20 np0005539510 podman[241093]: 2025-11-29 06:57:20.927314363 +0000 UTC m=+0.080933088 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 01:57:20 np0005539510 nova_compute[231979]: 2025-11-29 06:57:20.930 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:20 np0005539510 nova_compute[231979]: 2025-11-29 06:57:20.931 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:57:20 np0005539510 nova_compute[231979]: 2025-11-29 06:57:20.931 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:57:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:21.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:21.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:23.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:23.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:25.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:25.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.762 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.762 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.762 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.763 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:57:26 np0005539510 nova_compute[231979]: 2025-11-29 06:57:26.764 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:27.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.173 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.174 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.174 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.175 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.175 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:27.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:27 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:57:27 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2389936592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.798 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.942 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.943 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5279MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.943 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:27 np0005539510 nova_compute[231979]: 2025-11-29 06:57:27.943 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.312 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.312 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:57:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.397 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:57:28 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1141100869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.826 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.832 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.942 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.944 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:57:28 np0005539510 nova_compute[231979]: 2025-11-29 06:57:28.944 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:29.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:29.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:31.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:31.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:33.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:33.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 29 01:57:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:57:33 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 29 01:57:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:35.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:35.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:57:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:57:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:57:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2021761786' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:57:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:37.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:37.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:39.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:39.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:41.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:41.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:43.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:43.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:45.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:45.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:47.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:47.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:49.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:57:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:49.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:57:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:51.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:51.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:51 np0005539510 podman[241395]: 2025-11-29 06:57:51.915004273 +0000 UTC m=+0.065799100 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:57:51 np0005539510 podman[241396]: 2025-11-29 06:57:51.923130342 +0000 UTC m=+0.072016368 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:51 np0005539510 podman[241394]: 2025-11-29 06:57:51.943752617 +0000 UTC m=+0.097836013 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 01:57:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:53.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:55.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:55.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:57.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:57:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:57.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:57:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:57:57 np0005539510 ceph-mon[77142]: from='mgr.14132 192.168.122.100:0/717556443' entity='mgr.compute-0.vxabpq' 
Nov 29 01:57:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:57:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:57:59.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:57:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:57:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:57:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:57:59.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:01.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:01.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:03.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:03.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:05.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:05 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:05 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:05 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:05.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:07.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:07 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:07 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:07 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:07.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:08 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:09.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:09 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:09 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:09 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:09.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:11.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:11 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:11 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:11 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:11.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:13.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:13 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:13 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:13 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:13 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:13.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000026s ======
Nov 29 01:58:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:15.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 29 01:58:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:58:15.152 143385 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:58:15.152 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:15 np0005539510 ovn_metadata_agent[143380]: 2025-11-29 06:58:15.152 143385 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:15 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:15 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:15 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:15.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:17.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:17 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:17 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:17 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:17.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:18 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:19.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:19 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:19 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:19 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:19.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:21.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:21 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:21 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:21 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:21.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:22 np0005539510 podman[241624]: 2025-11-29 06:58:22.928647607 +0000 UTC m=+0.078924687 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:58:22 np0005539510 podman[241623]: 2025-11-29 06:58:22.958590548 +0000 UTC m=+0.110154043 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:58:22 np0005539510 podman[241622]: 2025-11-29 06:58:22.95866568 +0000 UTC m=+0.110880913 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:58:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:23.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:23 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:23 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:23 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:23 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:23.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.862 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.941 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.941 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:58:23 np0005539510 nova_compute[231979]: 2025-11-29 06:58:23.942 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.137 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.137 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.138 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.139 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.139 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:24 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:58:24 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2010625727' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.616 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.813 231983 WARNING nova.virt.libvirt.driver [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.814 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5283MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.815 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:24 np0005539510 nova_compute[231979]: 2025-11-29 06:58:24.815 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:25.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:25 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:25 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:25 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:25.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:25 np0005539510 nova_compute[231979]: 2025-11-29 06:58:25.921 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:58:25 np0005539510 nova_compute[231979]: 2025-11-29 06:58:25.922 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:58:25 np0005539510 nova_compute[231979]: 2025-11-29 06:58:25.941 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:58:26 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Nov 29 01:58:26 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3947950218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 29 01:58:26 np0005539510 nova_compute[231979]: 2025-11-29 06:58:26.396 231983 DEBUG oslo_concurrency.processutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:58:26 np0005539510 nova_compute[231979]: 2025-11-29 06:58:26.404 231983 DEBUG nova.compute.provider_tree [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed in ProviderTree for provider: 98b21ca7-b42c-4765-935a-26a89197ffb9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:58:26 np0005539510 nova_compute[231979]: 2025-11-29 06:58:26.481 231983 DEBUG nova.scheduler.client.report [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Inventory has not changed for provider 98b21ca7-b42c-4765-935a-26a89197ffb9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:58:26 np0005539510 nova_compute[231979]: 2025-11-29 06:58:26.483 231983 DEBUG nova.compute.resource_tracker [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:58:26 np0005539510 nova_compute[231979]: 2025-11-29 06:58:26.483 231983 DEBUG oslo_concurrency.lockutils [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.668s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:58:26 np0005539510 nova_compute[231979]: 2025-11-29 06:58:26.484 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:58:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:27.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:27 np0005539510 nova_compute[231979]: 2025-11-29 06:58:27.502 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:58:27 np0005539510 nova_compute[231979]: 2025-11-29 06:58:27.503 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:58:27 np0005539510 nova_compute[231979]: 2025-11-29 06:58:27.503 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:58:27 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:27 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:27 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:27.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:28 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:29.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:29 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:29 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:29 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:29.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:31 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:31 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:31 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:31.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:32 np0005539510 nova_compute[231979]: 2025-11-29 06:58:32.861 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:58:32 np0005539510 nova_compute[231979]: 2025-11-29 06:58:32.861 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 01:58:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:33.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:33 np0005539510 nova_compute[231979]: 2025-11-29 06:58:33.165 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 01:58:33 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:33 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:33 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:33 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:33.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:35.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:35 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:35 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:35 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:35.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:36 np0005539510 systemd-logind[784]: New session 52 of user zuul.
Nov 29 01:58:36 np0005539510 systemd[1]: Started Session 52 of User zuul.
Nov 29 01:58:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Nov 29 01:58:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 29 01:58:36 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Nov 29 01:58:36 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1396494908' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 29 01:58:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:37.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:37 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:37 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:37 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:37.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:38 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:39.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:39 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:39 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:39 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:39.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:39 np0005539510 nova_compute[231979]: 2025-11-29 06:58:39.862 231983 DEBUG oslo_service.periodic_task [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:58:39 np0005539510 nova_compute[231979]: 2025-11-29 06:58:39.862 231983 DEBUG nova.compute.manager [None req-323cd8ab-76b8-4690-9ddc-06ad15517150 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 01:58:39 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Nov 29 01:58:39 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/857796520' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 29 01:58:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:41.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:41 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:41 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:41 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:41.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:43.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:43 np0005539510 ovs-vsctl[242120]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 01:58:43 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:43 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:43 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:43.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:43 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:44 np0005539510 virtqemud[231501]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 01:58:44 np0005539510 virtqemud[231501]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 01:58:44 np0005539510 virtqemud[231501]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 01:58:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:45.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:45 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:45 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:45 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:45.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:45 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: cache status {prefix=cache status} (starting...)
Nov 29 01:58:45 np0005539510 lvm[242444]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 29 01:58:45 np0005539510 lvm[242444]: VG ceph_vg0 finished
Nov 29 01:58:45 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: client ls {prefix=client ls} (starting...)
Nov 29 01:58:46 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: damage ls {prefix=damage ls} (starting...)
Nov 29 01:58:46 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump loads {prefix=dump loads} (starting...)
Nov 29 01:58:46 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Nov 29 01:58:46 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2741286316' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 29 01:58:46 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 29 01:58:46 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 29 01:58:46 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 29 01:58:47 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 29 01:58:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:47.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Nov 29 01:58:47 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1529673451' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 29 01:58:47 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 29 01:58:47 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 29 01:58:47 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:47 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:47 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:47.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Nov 29 01:58:47 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4200116878' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 29 01:58:47 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: ops {prefix=ops} (starting...)
Nov 29 01:58:47 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Nov 29 01:58:47 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3706317250' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 29 01:58:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Nov 29 01:58:48 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/352369704' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 29 01:58:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 01:58:48 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/446556486' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 01:58:48 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: session ls {prefix=session ls} (starting...)
Nov 29 01:58:48 np0005539510 ceph-mds[83861]: mds.cephfs.compute-2.gxdwyy asok_command: status {prefix=status} (starting...)
Nov 29 01:58:48 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:49.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2498518273' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2393270283' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 29 01:58:49 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:49 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:49 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:49.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/258398947' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Nov 29 01:58:49 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2584480393' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 29 01:58:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 01:58:50 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3995147939' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 01:58:50 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Nov 29 01:58:50 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1377123572' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 29 01:58:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:51.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:51 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:51 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:51 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:51.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Reset 0.000186 1 0.000250
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Start
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] exit Start 0.000009 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] enter Started/Stray
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 129 handle_osd_map epochs [127,129], i have 129, src has [1,129]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 804350 data_alloc: 218103808 data_used: 270336
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 129 heartbeat osd_stat(store_statfs(0x1bceaa000/0x0/0x1bfc00000, data 0xc453d/0x183000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 scrub-queue::remove_from_osd_queue removing pg[9.1d] failed. State was: not registered w/ OSD
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 129 heartbeat osd_stat(store_statfs(0x1bceaa000/0x0/0x1bfc00000, data 0xc453d/0x183000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2bcf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 0'0 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY m=5 mbc={}] exit Started/Stray 5.068034 8 0.000145
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 0'0 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 0'0 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 crt=56'1130 mlcod 0'0 remapped NOTIFY m=5 mbc={}] enter Started/ReplicaActive/RepNotRecovering
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 scrub-queue::remove_from_osd_queue removing pg[9.1d] failed. State was: not registered w/ OSD
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.013292 4 0.000222
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepWaitRecoveryReserved
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] exit Started/ReplicaActive/RepWaitRecoveryReserved 0.000087 1 0.000116
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 lc 54'528 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 lcod 0'0 mlcod 0'0 active+remapped m=5 mbc={}] enter Started/ReplicaActive/RepRecovering
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepRecovering 0.216606 1 0.000085
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 131 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] enter Started/ReplicaActive/RepNotRecovering
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 818484 data_alloc: 218103808 data_used: 270336
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.497914314s of 10.499240875s, submitted: 21
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.604952 1 0.000083
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started/ReplicaActive 0.835138 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] exit Started 5.903227 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=129) [2]/[0] r=-1 lpr=129 pi=[93,129)/1 luod=0'0 crt=56'1130 mlcod 0'0 active+remapped mbc={}] enter Reset
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 luod=0'0 crt=56'1130 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] exit Reset 0.000147 1 0.000220
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Start
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started/Primary
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.005875 2 0.000435
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=0/0 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: merge_log_dups log.dups.size()=0olog.dups.size()=36
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: end of merge_log_dups changed=1 log.dups.size()=0 olog.dups.size()=36
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000746 2 0.000106
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 132 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 132 handle_osd_map epochs [132,133], i have 133, src has [1,133]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.495460 2 0.000095
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.502198 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=129/131 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=129/93 les/c/f=131/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005318 4 0.000260
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000030 0 0.000000
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 pg_epoch: 133 pg[9.1d( v 56'1130 (0'0,56'1130] local-lis/les=132/133 n=5 ec=58/47 lis/c=132/93 les/c/f=133/94/0 sis=132) [2] r=0 lpr=132 pi=[93,132)/1 crt=56'1130 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 133 heartbeat osd_stat(store_statfs(0x1bca8e000/0x0/0x1bfc00000, data 0xc992a/0x18e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 133 heartbeat osd_stat(store_statfs(0x1bca8c000/0x0/0x1bfc00000, data 0xcb456/0x191000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 1179648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 1179648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827726 data_alloc: 218103808 data_used: 270336
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca88000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827322 data_alloc: 218103808 data_used: 270336
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 1146880 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 827322 data_alloc: 218103808 data_used: 270336
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.387766838s of 15.429096222s, submitted: 16
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 135 heartbeat osd_stat(store_statfs(0x1bca8a000/0x0/0x1bfc00000, data 0xccf65/0x194000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 1212416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 135 heartbeat osd_stat(store_statfs(0x1bca86000/0x0/0x1bfc00000, data 0xcebbe/0x197000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 831496 data_alloc: 218103808 data_used: 278528
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 1204224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 1196032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 1196032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 136 heartbeat osd_stat(store_statfs(0x1bca83000/0x0/0x1bfc00000, data 0xd0851/0x19a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 834470 data_alloc: 218103808 data_used: 278528
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 1187840 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 1179648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.889656067s of 10.965355873s, submitted: 7
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 137 heartbeat osd_stat(store_statfs(0x1bca83000/0x0/0x1bfc00000, data 0xd0851/0x19a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 137 heartbeat osd_stat(store_statfs(0x1bca80000/0x0/0x1bfc00000, data 0xd2379/0x19d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 1171456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 1163264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 837444 data_alloc: 218103808 data_used: 278528
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 137 heartbeat osd_stat(store_statfs(0x1bca80000/0x0/0x1bfc00000, data 0xd2379/0x19d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 1155072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 1146880 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 1146880 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 840418 data_alloc: 218103808 data_used: 278528
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 1138688 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 1138688 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 1130496 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 1130496 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 138 heartbeat osd_stat(store_statfs(0x1bca7d000/0x0/0x1bfc00000, data 0xd3e9d/0x1a0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 1130496 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 138 handle_osd_map epochs [138,139], i have 138, src has [1,139]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.057117462s of 13.097633362s, submitted: 6
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 1122304 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 1122304 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 1114112 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 1114112 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 1105920 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 1105920 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 1105920 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 1097728 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 1089536 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 1081344 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 1081344 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 1073152 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 1073152 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 1073152 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 1064960 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 1064960 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 1064960 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 1056768 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 1048576 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 1040384 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 1040384 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 1032192 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 1024000 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 1024000 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 1024000 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 1015808 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 1007616 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 1007616 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 1007616 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 999424 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 999424 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 991232 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 991232 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 983040 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 974848 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 974848 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 966656 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 966656 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 966656 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 958464 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 958464 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 958464 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 950272 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 950272 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 942080 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 942080 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 942080 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 933888 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 933888 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 933888 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 925696 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 925696 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68231168 unmapped: 917504 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68231168 unmapped: 917504 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68231168 unmapped: 917504 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 909312 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 909312 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 909312 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 901120 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 901120 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 892928 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 884736 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 884736 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 876544 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 876544 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 876544 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 868352 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 868352 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 860160 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 860160 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 851968 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 851968 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 851968 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68304896 unmapped: 843776 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68304896 unmapped: 843776 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68304896 unmapped: 843776 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68313088 unmapped: 835584 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68313088 unmapped: 835584 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68321280 unmapped: 827392 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68321280 unmapped: 827392 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68321280 unmapped: 827392 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 819200 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 819200 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 819200 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68337664 unmapped: 811008 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68337664 unmapped: 811008 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68337664 unmapped: 811008 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68345856 unmapped: 802816 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68345856 unmapped: 802816 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 794624 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 794624 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 794624 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68362240 unmapped: 786432 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68362240 unmapped: 786432 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68362240 unmapped: 786432 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68370432 unmapped: 778240 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68370432 unmapped: 778240 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 761856 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68386816 unmapped: 761856 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 745472 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68378624 unmapped: 770048 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68395008 unmapped: 753664 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 745472 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 745472 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68411392 unmapped: 737280 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68411392 unmapped: 737280 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68411392 unmapped: 737280 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68419584 unmapped: 729088 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68419584 unmapped: 729088 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68419584 unmapped: 729088 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68427776 unmapped: 720896 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68427776 unmapped: 720896 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68427776 unmapped: 720896 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68435968 unmapped: 712704 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68435968 unmapped: 712704 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68435968 unmapped: 712704 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 704512 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68444160 unmapped: 704512 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68452352 unmapped: 696320 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68452352 unmapped: 696320 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 688128 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 688128 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 688128 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 671744 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 671744 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 671744 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68485120 unmapped: 663552 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68493312 unmapped: 655360 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68501504 unmapped: 647168 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68501504 unmapped: 647168 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68501504 unmapped: 647168 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68509696 unmapped: 638976 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68509696 unmapped: 638976 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68509696 unmapped: 638976 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 630784 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 630784 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 622592 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 622592 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 622592 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 614400 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 614400 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 606208 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 606208 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 606208 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 598016 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 598016 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 589824 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 589824 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 589824 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 581632 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 581632 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68567040 unmapped: 581632 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68583424 unmapped: 565248 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68583424 unmapped: 565248 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68591616 unmapped: 557056 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68591616 unmapped: 557056 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68599808 unmapped: 548864 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68608000 unmapped: 540672 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68608000 unmapped: 540672 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68608000 unmapped: 540672 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68616192 unmapped: 532480 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68616192 unmapped: 532480 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 524288 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 524288 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 524288 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68632576 unmapped: 516096 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68632576 unmapped: 516096 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68640768 unmapped: 507904 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68640768 unmapped: 507904 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68648960 unmapped: 499712 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68657152 unmapped: 491520 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68657152 unmapped: 491520 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68657152 unmapped: 491520 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68665344 unmapped: 483328 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 475136 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 475136 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68673536 unmapped: 475136 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68681728 unmapped: 466944 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68681728 unmapped: 466944 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68689920 unmapped: 458752 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68689920 unmapped: 458752 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68689920 unmapped: 458752 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 450560 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68698112 unmapped: 450560 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68706304 unmapped: 442368 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68706304 unmapped: 442368 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 434176 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 434176 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68714496 unmapped: 434176 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68722688 unmapped: 425984 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68722688 unmapped: 425984 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68722688 unmapped: 425984 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68730880 unmapped: 417792 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68730880 unmapped: 417792 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5353 writes, 23K keys, 5353 commit groups, 1.0 writes per commit group, ingest: 18.68 MB, 0.03 MB/s
Interval WAL: 5353 writes, 712 syncs, 7.52 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68796416 unmapped: 352256 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 344064 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 344064 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68812800 unmapped: 335872 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68812800 unmapped: 335872 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68812800 unmapped: 335872 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68820992 unmapped: 327680 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68820992 unmapped: 327680 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68829184 unmapped: 319488 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68829184 unmapped: 319488 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68837376 unmapped: 311296 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 303104 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68853760 unmapped: 294912 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68861952 unmapped: 286720 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68861952 unmapped: 286720 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68870144 unmapped: 278528 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68870144 unmapped: 278528 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 270336 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 270336 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68878336 unmapped: 270336 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68886528 unmapped: 262144 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68886528 unmapped: 262144 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 253952 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 253952 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 253952 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 245760 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 237568 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 237568 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 229376 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 229376 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 229376 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 221184 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 221184 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 221184 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68935680 unmapped: 212992 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68935680 unmapped: 212992 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 204800 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 204800 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68943872 unmapped: 204800 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 196608 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 196608 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 188416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 188416 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 180224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 180224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 180224 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 172032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 172032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 172032 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 155648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 155648 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 147456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 147456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 147456 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 139264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843712 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 139264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 139264 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 131072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7a000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 131072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 131072 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 296.232727051s of 296.242431641s, submitted: 3
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 0 heap: 69148672 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 1769472 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 1744896 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 1646592 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 1613824 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 540672 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 466944 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 442368 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 335872 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 262144 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 245760 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 237568 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 237568 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 229376 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 229376 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 221184 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 221184 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 221184 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 212992 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 212992 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 204800 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 204800 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71049216 unmapped: 196608 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 180224 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71073792 unmapped: 172032 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 163840 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 163840 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 147456 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 147456 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 139264 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 139264 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 139264 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 131072 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 131072 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 131072 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 106496 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 106496 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 98304 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 98304 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 90112 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 90112 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 90112 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 73728 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71180288 unmapped: 65536 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 57344 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 49152 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 40960 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 40960 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 40960 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 24576 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 16384 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 0 heap: 71245824 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71254016 unmapped: 1040384 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1032192 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71270400 unmapped: 1024000 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1015808 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1007616 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 991232 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 983040 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 974848 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 966656 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71335936 unmapped: 958464 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 950272 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 942080 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 933888 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71385088 unmapped: 909312 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 901120 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 892928 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 876544 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71426048 unmapped: 868352 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 843776 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 819200 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 811008 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 794624 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 786432 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 770048 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 761856 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 745472 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 737280 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 737280 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 737280 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 729088 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 729088 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 704512 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 688128 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 679936 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 663552 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 647168 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 638976 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 614400 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 589824 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 630784 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 606208 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 581632 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 565248 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 557056 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 532480 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 524288 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 507904 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 491520 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 483328 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 5796 writes, 24K keys, 5796 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5796 writes, 923 syncs, 6.28 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 443 writes, 694 keys, 443 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
Interval WAL: 443 writes, 211 syncs, 2.10 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55fecfc77350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.6e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71827456 unmapped: 466944 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 450560 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 434176 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 417792 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 401408 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 401408 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 599.242004395s of 600.520812988s, submitted: 232
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72032256 unmapped: 262144 heap: 72294400 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72261632 unmapped: 1081344 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72409088 unmapped: 933888 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72417280 unmapped: 925696 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 917504 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72433664 unmapped: 909312 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 901120 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 892928 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72458240 unmapped: 884736 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 876544 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72482816 unmapped: 860160 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72499200 unmapped: 843776 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72507392 unmapped: 835584 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 rsyslogd[1003]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72523776 unmapped: 819200 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 802816 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72556544 unmapped: 786432 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:51 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72581120 unmapped: 761856 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72597504 unmapped: 745472 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72605696 unmapped: 737280 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72638464 unmapped: 704512 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 688128 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72654848 unmapped: 688128 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72663040 unmapped: 679936 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72679424 unmapped: 663552 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72695808 unmapped: 647168 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72712192 unmapped: 630784 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 614400 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72736768 unmapped: 606208 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72753152 unmapped: 589824 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72761344 unmapped: 581632 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 548864 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72818688 unmapped: 524288 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72826880 unmapped: 516096 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72843264 unmapped: 499712 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.1 total, 600.0 interval
Cumulative writes: 6280 writes, 25K keys, 6280 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 6280 writes, 1161 syncs, 5.41 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 484 writes, 738 keys, 484 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s
Interval WAL: 484 writes, 238 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72802304 unmapped: 540672 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: mgrc ms_handle_reset ms_handle_reset con 0x55fed1842000
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1221624088
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1221624088,v1:192.168.122.100:6801/1221624088]
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: mgrc handle_mgr_configure stats_period=5
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72925184 unmapped: 417792 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 599.588073730s of 600.260498047s, submitted: 246
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72933376 unmapped: 409600 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [0,0,0,0,0,0,3])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72957952 unmapped: 385024 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 72974336 unmapped: 368640 heap: 73342976 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [1,0,1,1])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74162176 unmapped: 229376 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74326016 unmapped: 65536 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74268672 unmapped: 122880 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74276864 unmapped: 114688 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74285056 unmapped: 106496 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74293248 unmapped: 98304 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74301440 unmapped: 90112 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74309632 unmapped: 81920 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74317824 unmapped: 73728 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74334208 unmapped: 57344 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74342400 unmapped: 49152 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74358784 unmapped: 32768 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74375168 unmapped: 16384 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 0 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 0 heap: 74391552 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74424320 unmapped: 1015808 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74440704 unmapped: 999424 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74457088 unmapped: 983040 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74473472 unmapped: 966656 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: bluestore.MempoolThread(0x55fecfd55b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842832 data_alloc: 218103808 data_used: 286720
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74481664 unmapped: 958464 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74678272 unmapped: 761856 heap: 75440128 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'config diff' '{prefix=config diff}'
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'config show' '{prefix=config show}'
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'counter dump' '{prefix=counter dump}'
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: osd.2 139 heartbeat osd_stat(store_statfs(0x1bca7b000/0x0/0x1bfc00000, data 0xd59ac/0x1a3000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'counter schema' '{prefix=counter schema}'
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 74924032 unmapped: 1564672 heap: 76488704 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: prioritycache tune_memory target: 4294967296 mapped: 75235328 unmapped: 1253376 heap: 76488704 old mem: 2845415832 new mem: 2845415832
Nov 29 01:58:52 np0005539510 ceph-osd[79822]: do_command 'log dump' '{prefix=log dump}'
Nov 29 01:58:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:53.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:53 np0005539510 podman[243336]: 2025-11-29 06:58:53.390706896 +0000 UTC m=+0.080409318 container health_status d45765539066b12c036d0b55796617bdf3b0490035c2feb1a552f5e2fe5651d0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:58:53 np0005539510 podman[243334]: 2025-11-29 06:58:53.415723523 +0000 UTC m=+0.107004878 container health_status b155ad061a58ae5bd9a36dcdeb133f04f23285e6cf78a1de7d8f603e10bb2108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:58:53 np0005539510 podman[243329]: 2025-11-29 06:58:53.426207267 +0000 UTC m=+0.118027886 container health_status 4274087b5691318f202baa0f581f38faf78a6a6e4653fd17a3dd37e9752b6947 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:58:53 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:53 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:53 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:53.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Nov 29 01:58:53 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1383921042' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 29 01:58:53 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Nov 29 01:58:53 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2433791124' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 29 01:58:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Nov 29 01:58:54 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1819404255' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 29 01:58:54 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Nov 29 01:58:54 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3266793758' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 29 01:58:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:55.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:55 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:55 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:55 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:55.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:55 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Nov 29 01:58:55 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/800220038' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 29 01:58:56 np0005539510 systemd[1]: Starting Hostname Service...
Nov 29 01:58:56 np0005539510 systemd[1]: Started Hostname Service.
Nov 29 01:58:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:57.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:57 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:57 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.001000027s ======
Nov 29 01:58:57 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:57.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 29 01:58:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 29 01:58:58 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Nov 29 01:58:58 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2561335456' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 29 01:58:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:58:59.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:59 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:58:59 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:58:59 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:58:59.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:58:59 np0005539510 ceph-mds[83861]: mds.beacon.cephfs.compute-2.gxdwyy missed beacon ack from the monitors
Nov 29 01:58:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Nov 29 01:58:59 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2442087058' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 29 01:58:59 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Nov 29 01:58:59 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4281976021' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 29 01:59:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Nov 29 01:59:00 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4203124747' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Nov 29 01:59:00 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Nov 29 01:59:00 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2772849280' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 29 01:59:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:59:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:59:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:59:01 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:59:01 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:59:01 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.100 - anonymous [29/Nov/2025:06:59:01.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 29 01:59:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Nov 29 01:59:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1430961509' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Nov 29 01:59:02 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Nov 29 01:59:02 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3531646775' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 29 01:59:03 np0005539510 ceph-mon[77142]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Nov 29 01:59:03 np0005539510 ceph-mon[77142]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2078930364' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Nov 29 01:59:03 np0005539510 radosgw[83467]: ====== starting new request req=0x7fda342bd6f0 =====
Nov 29 01:59:03 np0005539510 radosgw[83467]: ====== req done req=0x7fda342bd6f0 op status=0 http_status=200 latency=0.000000000s ======
Nov 29 01:59:03 np0005539510 radosgw[83467]: beast: 0x7fda342bd6f0: 192.168.122.102 - anonymous [29/Nov/2025:06:59:03.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
